Working with S3
The Amazon S3 service is one of the first services that was offered by AWS. S3 provides secure, durable, and scalable object storage at a very low cost. Object storage just means that the things you store in S3 are accessible at the file level, instead of at the block or byte level. S3 is a very flexible service, with many usage patterns. You can read more about Amazon S3 at https://aws.amazon.com/s3.
Let's start working with Amazon S3 by creating a bucket. You can think of a bucket as a folder that can hold an unlimited number of files (objects).
Navigate to the Amazon S3 home page from the AWS Management Console by clicking on the Services tab in the top-left corner, and then click on or search for S3 under Storage. If this is your first time using S3, you will see a screen similar to this:

In this book, we will be leveraging S3 a lot in our hands-on projects. We will be using S3 for three main purposes. The first purpose is storing media files and other content for other AWS services to access. Many of the AWS AI services are tightly integrated with S3; for example, this is the pattern that we saw in the Rekognition demo. The second purpose is hosting entire static websites with S3, including HTML files, images, videos, and client-side JavaScript. This gives us the ability to host interactive web applications without the need for traditional web servers. The third purpose is using S3 as a data store for data collection, processing, and analytics tasks when we train our custom ML models.
Click on the Create bucket button to create a new bucket:

The first screen in the modal asks for three pieces of information: the Bucket name, the Region, and Copy settings from an existing bucket. Since this is your first bucket, we can ignore the third piece of information.
The S3 bucket name must be globally unique. This means that every bucket created by you or anyone else must have a name that no other bucket shares. Coming up with a globally unique bucket name can be challenging; you cannot expect names such as contents, website, or data to still be available. S3 bucket names must also be DNS-compliant, so you can follow naming patterns similar to domain names. For example, if we choose aws.ai as our root domain, we can create buckets such as contents.aws.ai, website.aws.ai, and data.aws.ai to avoid conflicts. Think about which root domain you would like to use.
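If you want a quick way to sanity-check a candidate name, the following sketch captures the core DNS-compliance rules (lowercase letters, digits, hyphens, and dots; 3 to 63 characters; starting and ending with a letter or digit). This is a simplified check, not the complete set of AWS naming rules:

```python
import re

def looks_dns_compliant(name: str) -> bool:
    # Simplified S3 naming check: 3-63 characters of lowercase letters, digits,
    # dots, and hyphens, starting and ending with a letter or digit.
    return bool(re.fullmatch(r'[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]', name))

print(looks_dns_compliant('contents.aws.ai'))  # True
print(looks_dns_compliant('Contents_AWS'))     # False: uppercase and underscore
```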
You must also specify the region of your bucket. This will determine in which physical region in the world your objects will be stored. The AWS regions are completely isolated from each other by design. Objects stored in one region cannot be accessed by services and applications running in a different region. This can be important if your line of business has high-performance requirements that need your applications and data to be located closer to your customers. This can also be important if your line of business must comply with industry and government regulations that require your applications and data to be located within a certain geographic location.
For the projects in this book, we do not have either of these concerns. Therefore, for consistency, let's pick the US East (N. Virginia) region again.
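If you prefer to script this step instead of using the console, here is a minimal sketch using boto3, the AWS SDK for Python; the bucket name is only an example and must be replaced with your own globally unique name:

```python
import boto3

# Use the US East (N. Virginia) region, matching the choice made in the console.
s3 = boto3.client('s3', region_name='us-east-1')

# For us-east-1, no CreateBucketConfiguration is needed; for any other region,
# pass CreateBucketConfiguration={'LocationConstraint': '<region>'}.
s3.create_bucket(Bucket='contents.aws.ai')
```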
Here's what your S3 buckets page might look like after creation, but of course, with different bucket names:

Once you have created the S3 buckets, go ahead and click on the contents.aws.ai bucket. You will see a screen similar to this:

On this screen, you will be able to upload files to the bucket, configure bucket properties, set access permissions, and configure advanced features such as lifecycle rules and cross-region replication. We will come back to some of these settings later on, but for now, upload one or more photos that you want to analyze with the Rekognition service. You can click on the Upload button or simply drag and drop the photos onto this page to upload them. We can leave all of the file settings at their defaults for now.
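The same upload can be scripted with boto3; in this sketch, the file name is a placeholder for one of your own photos, and the bucket name is the one created earlier:

```python
import boto3

s3 = boto3.client('s3')

# Upload a local photo to the bucket; the object key here mirrors the file name.
s3.upload_file('my-photo.jpg', 'contents.aws.ai', 'my-photo.jpg')
```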
Congratulations, you have just stored some files on the AWS cloud platform with 99.999999999% durability and 99.99% availability! In other words, if you store 10,000 files in S3, statistically you would lose one file every 10 million years, and the files are available to your application for 525,547.4 out of the 525,600 minutes in each year.
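These figures follow directly from the durability and availability percentages; here is a quick back-of-the-envelope check:

```python
annual_loss_probability = 1e-11   # 1 - 0.99999999999 (eleven nines of durability)
availability = 0.9999

files = 10_000
expected_losses_per_year = files * annual_loss_probability
print(f"{1 / expected_losses_per_year:,.0f} years")         # ~10,000,000 years per lost file

minutes_per_year = 525_600
print(f"{minutes_per_year * availability:,.1f} minutes")    # ~525,547.4 available minutes per year
```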