Mastering AWS CLI: A Step-by-Step Guide to IAM, S3, Snapshots, and More

Day 42 & 43

IAM Programmatic Access

IAM (Identity and Access Management) programmatic access is a critical component of modern IT and cybersecurity strategies. It refers to the ability of users and applications to interact with cloud services, APIs (Application Programming Interfaces), and various resources within an organization's infrastructure programmatically. Programmatic access allows for automation, orchestration, and the development of applications that can securely interact with cloud services and data.

Here are some key aspects and considerations related to IAM programmatic access:

  1. Authentication and Authorization: IAM programmatic access involves both authentication and authorization. Authentication verifies the identity of a user or application, ensuring that they are who they claim to be. Authorization determines what actions or resources they are allowed to access once their identity is established.

  2. Access Keys and Tokens: To enable programmatic access, users and applications are typically provided with access keys or tokens. These are essentially credentials that are used to authenticate and gain access to resources. Access keys can be long-lived (permanent) or short-lived (temporary), depending on the security requirements and use case.

  3. Secure Storage and Handling: Access keys and tokens must be stored securely. For example, they should not be hard-coded into application source code or publicly accessible repositories. Instead, they should be stored in a secure secrets manager or vault. Developers must also follow best practices for secure handling, like rotating keys and minimizing exposure.

  4. Least Privilege Principle: IAM programmatic access should adhere to the principle of least privilege, which means granting only the permissions necessary for a user or application to perform its specific tasks. This reduces the potential for misuse or accidental data breaches.

  5. API Calls and SDKs: Programmatic access often involves making API calls to cloud services. Cloud providers typically offer software development kits (SDKs) and libraries that simplify the process of integrating with their services. These SDKs help developers manage authentication, authorization, and error handling.

  6. Monitoring and Auditing: IAM programmatic access activities should be closely monitored and audited. This includes tracking who is accessing resources, what actions they are performing, and when these actions occur. Cloud providers often offer tools and services for logging and monitoring these activities.

  7. Multi-Factor Authentication (MFA): Enforcing MFA for programmatic access can significantly enhance security. MFA requires users or applications to provide two or more forms of verification before granting access, adding an extra layer of protection against unauthorized access.

  8. Access Policies: Access control policies define what resources a user or application can access and what actions they can perform. These policies should be well-defined, regularly reviewed, and properly enforced to maintain a secure environment.

  9. Role-Based Access Control (RBAC): RBAC is a common approach to managing programmatic access. It involves assigning roles to users or applications and associating permissions with those roles. This simplifies access management and helps maintain a consistent access control model.

  10. Compliance and Regulations: Depending on the industry and location, there may be specific compliance requirements and regulations governing programmatic access. Organizations must ensure that their IAM practices align with these regulations to avoid legal and financial consequences.
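As a concrete illustration of points 4 and 8 above, a least-privilege identity policy for an application that only needs to read objects from a single bucket might look like the following sketch (the bucket name my-app-bucket is a placeholder, not a real resource):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ReadOnlyOneBucket",
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::my-app-bucket/*"
    }
  ]
}
```

Granting only s3:GetObject on one bucket's objects, rather than s3:* on all resources, limits the blast radius if the credentials ever leak.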

IAM programmatic access is a fundamental aspect of cloud computing, DevOps, and modern software development. When implemented correctly, it not only enhances security but also enables organizations to build efficient and scalable systems that can adapt to the dynamic needs of the digital landscape. However, it requires careful planning, continuous monitoring, and ongoing maintenance to remain effective in safeguarding an organization's digital assets.
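Tying points 2 and 3 together, here is a minimal sketch of supplying credentials without hard-coding them. The key values are the placeholder examples from AWS's own documentation, not real credentials:

```shell
# Export credentials as environment variables instead of hard-coding
# them in scripts. The values below are AWS's documented example
# placeholders, not real keys.
export AWS_ACCESS_KEY_ID="AKIAIOSFODNN7EXAMPLE"
export AWS_SECRET_ACCESS_KEY="wJalrXUtnFXEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"

# Any AWS CLI or SDK call in this shell now picks these up
# automatically. In production, fetch them from a secrets manager
# at runtime rather than exporting them by hand.
```

Environment variables keep secrets out of source control, but they are still visible to the process environment, which is why a secrets manager or short-lived tokens are the stronger option.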

AWS CLI

The AWS Command Line Interface (AWS CLI) is a powerful tool provided by Amazon Web Services (AWS) that allows users to interact with AWS services and resources from their local terminal or from scripts. It offers a unified command-line interface for managing and configuring AWS services, making it an essential tool for developers, administrators, and DevOps professionals working with AWS.

Here are some key aspects and features of the AWS CLI:

  1. Installation and Configuration: To start using the AWS CLI, users need to install it on their local system. Once installed, they configure it with their AWS access credentials, which typically include an Access Key ID and Secret Access Key. These credentials grant the necessary permissions to interact with AWS resources securely.

  2. Cross-Platform Compatibility: The AWS CLI is designed to work on various operating systems, including Windows, macOS, and Linux, making it accessible to a wide range of users.

  3. Versatility: AWS CLI provides a vast set of commands, allowing users to manage AWS services such as EC2 (Elastic Compute Cloud), S3 (Simple Storage Service), IAM (Identity and Access Management), Lambda, and many others. Users can perform tasks like creating and managing instances, configuring security groups, uploading and downloading files, and more.

  4. Scripting and Automation: AWS CLI is invaluable for scripting and automation tasks. Users can write scripts to automate infrastructure provisioning, perform routine maintenance, and respond to events. It integrates seamlessly with popular scripting languages like Python and Bash.

  5. Output Formatting: Users can customize the output of AWS CLI commands to various formats, including JSON, text, and table, making it easier to parse and use the output in scripts or other tools.

  6. AWS Profiles: AWS CLI supports multiple profiles, enabling users to switch between different AWS accounts or roles easily. This is particularly useful for individuals or teams managing multiple AWS environments.

  7. Interactive Prompts: Commands such as aws configure walk users through setup with interactive prompts, and AWS CLI v2 adds an auto-prompt mode (enabled with the --cli-auto-prompt flag) that suggests commands and parameters as you type.

  8. Plugins and Extensions: AWS CLI can be extended with plugins to add functionality beyond the core AWS services. These plugins can be developed by AWS or the community to address specific use cases or services.

  9. Version Compatibility: AWS CLI maintains backward compatibility within a major version, so scripts and automation developed against one release generally keep working with newer releases; note, however, that AWS CLI v2 introduced some breaking changes relative to v1.

  10. Open Source: The AWS CLI is open source and actively maintained on GitHub, allowing the community to contribute improvements, report issues, and propose new features.

  11. Cost and Pricing Data: Through services such as Cost Explorer (aws ce) and the Price List API (aws pricing), the AWS CLI can retrieve cost and pricing information, helping users understand the potential cost of specific AWS resources or configurations.

  12. Logging and Debugging: AWS CLI provides options for logging and debugging, which can be helpful when diagnosing issues or errors in scripts or commands.
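Points 4 through 6 above can be pulled together in a short script. This is only a sketch: the bucket name, source folder, and dev profile are placeholders, and the run helper defaults to printing each aws command instead of executing it (set DRY_RUN=0 to run for real), so the logic can be checked without AWS credentials:

```shell
#!/bin/sh
# Nightly-backup sketch: sync a local folder to S3 under a dated
# prefix, using a named profile and JSON output. The bucket, folder,
# and profile names are illustrative placeholders.
BUCKET="my-backup-bucket"
SRC_DIR="/var/backups"
DATE=$(date +%F)

# In dry-run mode (the default here) each command is echoed instead
# of executed, so the script can be tested without credentials.
run() {
  if [ "${DRY_RUN:-1}" = "1" ]; then echo "+ $*"; else "$@"; fi
}

run aws s3 sync "$SRC_DIR" "s3://$BUCKET/$DATE/" --profile dev
run aws s3api list-objects-v2 --bucket "$BUCKET" --prefix "$DATE/" \
    --output json --profile dev
```

The dry-run wrapper is a common pattern for automation scripts: the same code path produces either a preview of the commands or the real run, which makes the script safe to test in CI.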

Task 1: Create AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY from AWS Console

  1. Navigate to the IAM dashboard and create a user.

  2. Navigate to the test user and click on the Security Credentials Tab

  3. Under Security Credentials, click on Create Access Key

  4. The Access Key and Secret Access Key will be shown on the screen; this is the only time the secret key is visible, so make sure you don't share it with anyone. Copy both, or click on Download .csv file and save it to a secure location.

Task 2: Setup and install AWS CLI and configure your account credentials

  1. To install it on your system, refer to the official AWS CLI installation documentation.

  2. I have already created an EC2 instance running Ubuntu and installed the AWS CLI using sudo apt install -y awscli

  3. To configure the AWS CLI with your account, run aws configure; it will prompt for the Access Key and Secret Access Key. Enter both, leave the remaining prompts at their defaults, and hit Enter. I have already configured it, so when I run aws configure again it displays the existing values as defaults.
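Behind the scenes, aws configure writes two plain-text files in the home directory. A sketch of what they end up looking like (the key values are AWS's documented placeholders, and the region is just an example):

```ini
; ~/.aws/credentials (keys shown are documentation placeholders)
[default]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFXEMI/K7MDENG/bPxRfiCYEXAMPLEKEY

; ~/.aws/config
[default]
region = us-east-1
output = json
```

Additional named sections (for example [profile dev] in ~/.aws/config) let you keep credentials for several accounts side by side and select one with --profile dev or the AWS_PROFILE environment variable, as mentioned in the AWS Profiles point earlier.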

AWS S3 (Simple Storage Service)

Amazon Simple Storage Service (Amazon S3) is a highly versatile and scalable object storage service provided by Amazon Web Services (AWS). It offers a wide range of features and capabilities that make it a popular choice for businesses of all sizes and across various industries. Let's explore the key aspects and topics related to Amazon S3:

Features of Amazon S3: Amazon S3 offers a rich set of features, making it a versatile storage solution for a variety of use cases:

  • Storage Classes: Amazon S3 provides multiple storage classes, including S3 Standard, S3 Standard-IA, S3 One Zone-IA, S3 Glacier Instant Retrieval, S3 Glacier Flexible Retrieval, S3 Glacier Deep Archive, and S3 Intelligent-Tiering. Each class is designed to cater to different access patterns and cost considerations.

  • Storage Management: S3 offers tools like S3 Lifecycle, which allows you to automate data transitions between storage classes, and S3 Object Lock, which adds an extra layer of protection against object changes and deletions.

  • Data Replication: S3 Replication enables you to replicate objects and their metadata to different AWS Regions or within the same Region, ensuring data availability and compliance.

  • Batch Operations: S3 Batch Operations enables you to manage large numbers of objects at scale using a single API request or console operation, simplifying tasks like copying, invoking AWS Lambda functions, and restoring objects.
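As a sketch of the S3 Lifecycle feature mentioned above, a rule that moves objects under a given prefix to the Glacier storage class after 90 days and expires them after a year might look like this (the rule ID and logs/ prefix are placeholders):

```json
{
  "Rules": [
    {
      "ID": "archive-old-logs",
      "Filter": { "Prefix": "logs/" },
      "Status": "Enabled",
      "Transitions": [
        { "Days": 90, "StorageClass": "GLACIER" }
      ],
      "Expiration": { "Days": 365 }
    }
  ]
}
```

A configuration of this shape can be applied with aws s3api put-bucket-lifecycle-configuration, passing the JSON via --lifecycle-configuration file://rules.json.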

Access Management and Security: Amazon S3 places a strong emphasis on access control and security:

  • S3 Block Public Access: By default, S3 buckets and objects are private. Block Public Access settings ensure that public access is restricted at the bucket level, adding an additional layer of security.

  • AWS Identity and Access Management (IAM): IAM allows you to securely control access to your S3 resources. You can centrally manage permissions, specifying who can access what resources within your AWS account.

  • Bucket Policies: Use IAM-based policy language to define resource-based permissions for your S3 buckets and objects.

  • S3 Access Points: Access points provide named network endpoints with dedicated access policies, simplifying data access management for shared datasets.

  • Access Control Lists (ACLs): While less recommended, ACLs allow you to grant read and write permissions for individual buckets and objects to authorized users.

  • S3 Object Ownership: The Object Ownership setting lets the bucket owner take ownership of every object uploaded to the bucket (and can disable ACLs entirely), simplifying access management for data stored in Amazon S3.

  • IAM Access Analyzer for S3: This tool helps evaluate and monitor S3 bucket access policies to ensure they provide only the intended access to your resources.
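To make the bucket policy idea above concrete, here is a minimal resource-based policy sketch granting one IAM user read access (the account ID 123456789012, user name test-user, and bucket name my-app-bucket are all placeholders; a production policy would typically split the two actions into separate statements scoped to their exact resources):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowReadForOneUser",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::123456789012:user/test-user" },
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::my-app-bucket",
        "arn:aws:s3:::my-app-bucket/*"
      ]
    }
  ]
}
```

Note that s3:ListBucket applies to the bucket ARN itself while s3:GetObject applies to the objects (the /* ARN), which is why both Resource entries are needed.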

Data Processing: Amazon S3 supports data processing and workflow automation:

  • S3 Object Lambda: Allows you to add custom code to S3 GET, HEAD, and LIST requests to modify and process data before it's returned to an application.

  • Event Notifications: Trigger workflows using services like Amazon SNS, Amazon SQS, and AWS Lambda when changes are made to your S3 resources.
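A sketch of an event notification configuration that invokes a Lambda function whenever an object is created (the function ARN is a placeholder):

```json
{
  "LambdaFunctionConfigurations": [
    {
      "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:process-upload",
      "Events": ["s3:ObjectCreated:*"]
    }
  ]
}
```

A configuration of this shape can be applied with aws s3api put-bucket-notification-configuration; analogous TopicConfigurations and QueueConfigurations blocks target Amazon SNS and Amazon SQS instead of Lambda.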

Storage Logging and Monitoring: Amazon S3 provides tools for monitoring and controlling resource usage:

  • Amazon CloudWatch Metrics: Track the operational health of your S3 resources and set up billing alerts.

  • AWS CloudTrail: Record actions taken by users or services in Amazon S3, providing detailed API tracking for bucket and object operations.

  • Server Access Logging: Get detailed records of requests made to a bucket, useful for security audits and understanding usage.

  • AWS Trusted Advisor: Evaluate your AWS account to optimize infrastructure, improve security and performance, reduce costs, and monitor service quotas.

Analytics and Insights: Amazon S3 helps you gain visibility into storage usage:

  • Amazon S3 Storage Lens: Provides usage and activity metrics, interactive dashboards, and insights for your entire organization, specific accounts, AWS Regions, buckets, or prefixes.

  • Storage Class Analysis: Analyze storage access patterns to decide when to move data to more cost-effective storage classes.

Strong Consistency: Amazon S3 offers strong read-after-write consistency for PUT and DELETE requests, ensuring that you retrieve the correct data even after updates.

How Amazon S3 Works: Amazon S3 operates by storing data as objects within containers called buckets. Each object includes data and metadata, and it is uniquely identified by a key within a bucket. Amazon S3 is organized hierarchically, allowing you to structure data within buckets and retrieve objects via unique URLs.

Buckets: Buckets serve as containers for objects and organize the Amazon S3 namespace. They are created in a specific Region and have globally unique names.

Objects: Objects are the fundamental entities stored in Amazon S3. They consist of data, metadata, and a unique key. Objects are addressed using a combination of bucket name, key, and optionally, version ID.

S3 Versioning: S3 supports versioning, allowing you to keep multiple versions of objects within the same bucket for data protection and recovery.

Keys: Object keys are unique identifiers within a bucket. The combination of bucket, key, and version (if enabled) uniquely identifies each object.

Related Services: Amazon S3 integrates seamlessly with other AWS services, including Amazon EC2 for computing capacity, Amazon EMR for data processing, AWS Snow Family for offline data transfer, AWS Transfer Family for secure file transfers, and more.

Accessing Amazon S3: You can interact with Amazon S3 through various methods:

  • AWS Management Console: A web-based interface for managing S3 and other AWS resources.

  • AWS Command Line Interface (CLI): A set of command-line tools for performing AWS tasks.

  • AWS SDKs: Software development kits for various programming languages, simplifying programmatic access to S3.

  • Amazon S3 REST API: A RESTful HTTP interface for programmatic access, suitable for building custom applications.

Paying for Amazon S3: Amazon S3 follows a pay-as-you-go pricing model, where you are charged only for the storage and data transfer you use. There are no upfront fees, and you can take advantage of the AWS free tier for new customers.

PCI DSS Compliance: Amazon S3 is compliant with the Payment Card Industry Data Security Standard (PCI DSS), making it suitable for processing, storing, and transmitting credit card data securely.

Task 1

Step 1: Create an S3 bucket and upload a file to it using the AWS Management Console.

  1. Navigate to the S3 console and click on Create Bucket.

  2. Give it a globally unique name.

  3. Leave the rest at the defaults and hit Create bucket. The new bucket will then appear in your bucket list.

  4. Let's navigate into it and upload a file. Currently it is empty.

  5. To add a file, click on Upload or drag and drop the file, then hit Upload.

Step 2: Access the file from the EC2 instance using the AWS Command Line Interface (AWS CLI).

  1. Connect to your instance using SSH

  2. To list the bucket type aws s3 ls

  3. To download the file using the CLI, run the command aws s3 cp s3://bucket-name/file.txt . Make sure to replace bucket-name with your actual bucket name and file.txt with your file name; the . at the end represents the current directory.

  4. Run ls to confirm the file is there, and use the cat command to view its contents.

Task 2

Step 1 : Create a snapshot of the EC2 instance and use it to launch a new EC2 instance.

  1. Navigate to the EC2 console and, on the left, look for Snapshots under Elastic Block Store and click on it. Click on Create Snapshot, choose Instance as the resource type, and select your instance.

  2. Leave everything else at the defaults and click on Create. The snapshot will be in a pending state for a few seconds before it is ready.

  3. To launch an instance from it, we first need to create an image from it, also called an Amazon Machine Image (AMI). Navigate to Snapshots, select your snapshot, click on Actions in the top-right corner, and choose Create image from snapshot.

  4. Give it a name, leave the rest at the defaults, and hit Create.

  5. Now navigate to the Images section and click on AMIs under it. It will display our image.

  6. Now select the image and click on Launch instance from AMI in the top-right corner, give it a name, select your key pair, and hit Launch instance.

  7. Connect to it using SSH.

Step 2: Verify that the file and its contents are present on the new instance.

Make sure to delete the snapshot, AMI, and instances afterwards, as they incur costs.

Here are some commonly used AWS CLI commands for Amazon S3:

aws s3 ls - This command lists all of the S3 buckets in your AWS account.

aws s3 mb s3://bucket-name - This command creates a new S3 bucket with the specified name.

aws s3 rb s3://bucket-name - This command deletes the specified S3 bucket.

aws s3 cp file.txt s3://bucket-name - This command uploads a file to an S3 bucket.

aws s3 cp s3://bucket-name/file.txt . - This command downloads a file from an S3 bucket to your local file system.

aws s3 sync local-folder s3://bucket-name - This command syncs the contents of a local folder with an S3 bucket.

aws s3 ls s3://bucket-name - This command lists the objects in an S3 bucket.

aws s3 rm s3://bucket-name/file.txt - This command deletes an object from an S3 bucket.

aws s3 presign s3://bucket-name/file.txt - This command generates a pre-signed URL for an S3 object, which can be used to grant temporary access to the object.

aws s3api list-buckets - This command retrieves a list of all S3 buckets in your AWS account, using the S3 API.
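To show how the JSON output of that last command can be processed in a script, here is a hedged sketch: the response below is a hand-written sample in the shape aws s3api list-buckets returns, not real output, and the bucket names are made up, so the parsing step can be demonstrated without AWS credentials.

```shell
# Hand-written sample in the shape returned by `aws s3api list-buckets`
# (bucket names are made up), captured as a variable so the parsing
# can be demonstrated offline.
sample='{"Buckets":[{"Name":"my-logs"},{"Name":"my-data"}]}'

# Pull out just the bucket names, similar to what the server-side
# JMESPath filter `--query "Buckets[].Name"` would return.
names=$(printf '%s\n' "$sample" | grep -o '"Name":"[^"]*"' | cut -d'"' -f4)
echo "$names"
```

In real use you would skip the local parsing entirely and let the CLI do it: aws s3api list-buckets --query "Buckets[].Name" --output text.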