Monday, October 31, 2016

AWS Week in Review – October 24, 2016

Another busy week in AWS-land! Today's post included submissions from 21 internal and external contributors, along with material from my RSS feeds, my inbox, and other things that come my way. To join in the fun, create (or find) some awesome AWS-related content and submit a pull request!
Monday

October 24




Tuesday

October 25




Wednesday

October 26




Thursday

October 27




Friday

October 28




Saturday

October 29




Sunday

October 30



New & Notable Open Source



  • aws-git-backed-static-website is a Git-backed static website generator powered entirely by AWS.

  • rds-pgbadger fetches log files from an Amazon RDS for PostgreSQL instance and generates a beautiful pgBadger report.

  • aws-lambda-redshift-copy is an AWS Lambda function that automates the copy command in Redshift.

  • VarnishAutoScalingCluster contains code and instructions for setting up a shared, horizontally scalable Varnish cluster that scales up and down using Auto Scaling groups.

  • aws-base-setup contains starter templates for developing AWS CloudFormation-based AWS stacks.

  • terraform_f5 contains Terraform scripts to instantiate an F5 BIG-IP in AWS.

  • claudia-bot-builder creates chat bots for Facebook, Slack, Skype, Telegram, GroupMe, Kik, and Twilio and deploys them to AWS Lambda in minutes.

  • aws-iam-ssh-auth is a set of scripts used to authenticate users connecting to EC2 via SSH with IAM.

  • go-serverless sets up a go.cd server for serverless application deployment in AWS.

  • awsq is a helper script to run batch jobs on AWS using SQS.

  • respawn generates CloudFormation templates from YAML specifications.


New SlideShare Presentations



New Customer Success Stories



  • AbemaTV – AbemaTV is an Internet media-services company that operates one of Japan's leading streaming platforms, FRESH! by AbemaTV. The company built its microservices platform on Amazon EC2 Container Service and uses an Amazon Aurora data store for its write-intensive microservices, such as timelines and chat, and a MySQL database on Amazon RDS for the remaining microservices APIs. By using AWS, AbemaTV has been able to quickly deploy its new platform at scale with minimal engineering effort.

  • Celgene – Celgene uses AWS to enable secure collaboration between internal and external researchers, allow individual scientists to launch hundreds of compute nodes, and reduce the time it takes to do computational jobs from weeks or months to less than a day. Celgene is a global biopharmaceutical company that creates drugs that fight cancer and other diseases and disorders. Celgene runs its high-performance computing research clusters, as well as its research collaboration environment, on AWS.

  • Under Armour – Under Armour can scale its Connected Fitness apps to meet the demands of more than 180 million global users, innovate and deliver new products and features more quickly, and expand internationally by taking advantage of the reliability and high availability of AWS. The company is a global leader in performance footwear, apparel, and equipment. Under Armour runs its growing Connected Fitness app platform on the AWS Cloud.


New YouTube Videos



Upcoming Events



Help Wanted



Stay tuned for next week! In the meantime, follow me on Twitter and subscribe to the RSS feed.



AWS Hot Startups – October 2016 – Optimizely, Touch Surgery, WittyFeed

I'm pleased to share yet another month of hot startups, courtesy of Tina Barr!

-
Jeff;


Check out this month's AWS-powered startups:



  • Optimizely – Providing web and mobile A/B testing for the world's leading brands.

  • Touch Surgery – Building technologies for the global surgical community.

  • WittyFeed – Creating viral content.


Optimizely (San Francisco)
Optimizely is one of the world's leading experience optimization platforms, providing website and mobile A/B testing and personalization for leading brands. The platform's easy-to-use products, coupled with the high speed of deployment, allow organizations to run experiments and make data-driven decisions. Thousands of customers worldwide are using Optimizely to deliver better experiences to their audiences across a variety of channels. To date, those customers have created and delivered over 275 billion optimized visitor experiences.


In today's digital world, it is essential for companies to provide personalized experiences for their consumers. In Optimizely's words, “Being personal is no longer optional.” Their web personalization products allow companies to deliver targeted content to their customers in real time by using browsing behavior, demographic information, contextual clues, and 1st- and 3rd-party data. This in turn drives revenue and keeps customers coming back. Companies don't have to rely on a team of analysts or engineers to know the impact of their campaigns either – Optimizely has developed an industry-leading Stats Engine to support experimentation and decision-making on their terms.


Optimizely relies heavily on AWS to host core parts of the data infrastructure that powers experimentation, targeting, and personalization across different channels. They use services like Amazon S3, Amazon EC2, Amazon EMR, Amazon RDS, Amazon Redshift, Amazon DynamoDB, and Amazon ElastiCache to rapidly and reliably scale out infrastructure to support business growth. Optimizely also uses AWS security services such as AWS Identity and Access Management (IAM) roles, Amazon Virtual Private Cloud (VPC), and AWS CloudTrail for auditing in order to enhance customer trust in the product.


Check out the Optimizely blog to keep up with the latest news!


Touch Surgery (London)
Founded by four surgeons, Touch Surgery is a technology startup with a mission to scale global surgery by using the latest technologies to train surgeons and standardize surgical knowledge. The company is transforming professional healthcare training through the delivery of a unique platform that links mobile apps with a powerful data back-end. The team behind Touch Surgery has spent years working with leading surgical minds from around the world to map and share the language of surgery.


Touch Surgery uses cognitive mapping techniques coupled with cutting edge AI and 3D rendering technology to codify surgical procedures. They have partnered with leaders in Virtual Reality and Augmented Reality to work toward a vision of advancing surgical care in the operating room (OR). Future surgeons can practice anytime, anywhere by downloading the app (iOS & Android). The app allows users to learn and practice over 50 surgical procedures, evaluate and measure progress, and connect with physicians across the world. Touch Surgery also offers a variety of simulations in specialties such as neurosurgery, orthopedics, plastics, and more.


Touch Surgery's back-end is fully hosted on AWS. They make use of many Amazon EC2 instances, as well as Amazon S3 and Amazon Elasticsearch Service. AWS has allowed them to scale with the use of Amazon RDS, and they now have over 1 million users and are recording vast amounts of usage data to power their data analytics product. Touch Surgery is also able to deploy different environments with relative ease to help with testing and increase the speed of delivery.


Touch Surgery is always looking for more talent. For more info check out https://www.touchsurgery.com/jobs/.


WittyFeed (India)
WittyFeed is a platform connecting social media influencers with creative writers who have a passion for sharing entertaining and attention-grabbing stories. Launched in 2014 by cofounders Vinay Singhal (CEO), Shashank Vaishnav (CTO), and Parveen Singhal (COO), WittyFeed has become India's largest viral-content company, generating over 60 stories per day and attracting 75 million unique visitors every month. They have over 100 writers who are constantly producing content, and an in-house team of editors ensures that the content goes viral and reaches the right audience. The WittyFeed team is passionate about helping publishers grow their pages and is currently creating tools that will tell them not only when to post, but also what type of content will reach a broad audience.


WittyFeed is easy to use and allows readers to personalize their feeds to get the most relevant content. Their focus is on photo-stories, 'listicles', and 'charticles' in categories such as Technology, History, Sports, and of course, Hilarious. WittyFeed also makes it easy for users to engage and interact with each other with their new live-commenting feature, which allows readers to leave comments without having to sign into a personal account.


Being a content company means that WittyFeed has to handle heavy traffic, a large user base, and a lot of data. With AWS, they have a highly scalable infrastructure that can handle sudden surges and adapt to changing needs in a short period of time. WittyFeed uses a broad range of services including Amazon Kinesis Firehose, Amazon Redshift, and Amazon RDS. These services allow WittyFeed to handle over 100 million visitors and serve over 500 million page views with almost zero downtime, and have become the central pillars of its infrastructure.


Start exploring WittyFeed today!


- Tina Barr



Sunday, October 30, 2016

Get $25 Back with $50 Luxury Beauty Purchase!

Calling all beauty aficionados!  Indulge in a wide range of Luxury Beauty items and get $25 cash back when you spend $50.  All you need to do is enter the promo code LUXPRIME25 at checkout and the $25 credit is automatically applied toward your future Luxury Beauty or Professional Beauty … Continue reading Get $25 Back with $50 Luxury Beauty Purchase!

Friday, October 28, 2016

Amazon's Alexa Officially Arrives on the Fire Tablets

Last month, Amazon announced that its voice-controlled virtual assistant would be available soon on the Fire tablets. Today, the software update is finally here, so you don't need an Echo, Tap, or Dot to be able to access Alexa. If you have the $49 Fire Tablet, Fire HD 8, or Fire HD 10, … Continue reading Amazon's Alexa Officially Arrives on the Fire Tablets

Wednesday, October 26, 2016

Splurge vs Save on Holiday Toys This Year

The biggest toy sellers have already released their Holiday Toy Lists for this year, and looking at the selection, a lot of the toys are very high-tech and could surely encourage your child's imagination, creativity, and motor and problem-solving skills.  However, most of them come with hefty price tags. I'm not … Continue reading Splurge vs Save on Holiday Toys This Year

New Utility – Opt-in to Longer Resource IDs Across All Regions

Early this year I announced that Longer EC2 Resource IDs are Now Available, and kicked off the start of a transition period that will last until early December 2016. During the transition period, you can opt in to the new resource format on a region-by-region, user-by-user basis. At the conclusion of the transition period, all newly created resources will be assigned 17-character identifiers. Here are some important dates for your calendar:



  • November – Beginning on November 1st, you can use the describe-id-format command to check on the cutover deadline for the regions that are of interest to you.

  • December – Between December 5th and December 16th, we will be setting individual AWS Regions to use 17-character identifiers by default.


In order to help you to ensure that your code and your tools can handle the new format, I'd like to personally encourage you to opt in as soon as possible!
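If you prefer to work with the CLI directly, the opt-in is a pair of commands. A minimal sketch (the region is just an example, and at this stage of the transition only certain resource types, such as instances, are affected):

```shell
# Check the current ID format settings and cutover deadlines for a region:
$ aws ec2 describe-id-format --region us-east-1

# Opt the calling IAM user or role in to 17-character instance IDs:
$ aws ec2 modify-id-format --resource instance --use-long-ids --region us-east-1
```

Running these requires a configured AWS account, so treat them as a sketch of the flow rather than a copy-paste recipe.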


We've launched a new longer-ID-converter tool that will allow you to opt in, opt out, or simply check the status. If you have already installed the AWS Command Line Interface (CLI), you can simply download the script, make it executable, and then run it:


$ wget https://raw.githubusercontent.com/awslabs/ec2-migrate-longer-id/master/migratelongerids.py
$ chmod +x migratelongerids.py

Here are some of the things that you can do.


Check the status of your account:


$ ./migratelongerids.py --status

Convert account, IAM Roles, and IAM Users to long IDs:


$ ./migratelongerids.py

Revert to short IDs:


$ ./migratelongerids.py --revert

Convert the current User/Role:


$ ./migratelongerids.py --convertself

For more information on this utility, check out the README file. For more information on the move to longer resource IDs, consult the EC2 FAQ.

-
Jeff;


Tuesday, October 25, 2016

Get Your First Dash Button for FREE!

It's not the end for Amazon Dash just yet.  In fact, the program is growing exponentially, and orders have increased 5x compared to last year.  It also launched recently in Austria, Germany, and the UK.  The significant increase in demand brought Amazon to add 60 new Dash buttons plus a … Continue reading Get Your First Dash Button for FREE!

Monday, October 24, 2016

Last Minute Halloween Shopping!

Hey, it's already the end of October!  What?!  If you are one of those busy adults who had no time to think of a costume or plan some spooky and fun activities for Halloween, then this post might just help you find all your trick-or-treat essentials. As for costumes, here are the most … Continue reading Last Minute Halloween Shopping!

Welcoming Adrian Cockcroft to the AWS Team.



I am excited that Adrian Cockcroft will be joining AWS as VP of Cloud Architecture. Adrian has played a crucial role in developing the cloud ecosystem as Cloud Architect at Netflix and later as a Technology Fellow at Battery Ventures. Prior to this, he held positions as Distinguished Engineer at eBay and Sun Microsystems. One theme that has been consistent throughout his career is that Adrian has a gift for seeing the bigger engineering picture.



At Netflix, Adrian played a key role in the company's much-discussed migration to a "cloud native" architecture, and the open sourcing of the widely used (and award-winning) NetflixOSS platform. AWS customers around the world are building more scalable, reliable, efficient and well-performing systems thanks to Adrian and the Netflix OSS effort.



Combine Adrian's big thinking with his excellent educational skills, and you understand why Adrian deserves the respect he receives around the world for helping others be successful on AWS. I'd like to share a few of Adrian's own words about his decision to join us...



"After working closely with many folks at AWS over the last seven years, I am thrilled to be joining the clear leader in cloud computing. The state of the art in infrastructure, software packages, and services is nowadays a combination of AWS and open source tools -- and they are available to everyone. This democratization of access to technology levels the playing field, and means anyone can learn and compete to be the best there is."



I am excited about welcoming Adrian to the AWS team, where he will work closely with AWS executives and product groups and consult with customers on their cloud architectures -- from start-ups that were born in the cloud to large web-scale companies and enterprises that have an “all-in” migration strategy. Adrian will also spend time engaging with developers in the Amazon-sponsored and supported open source communities. I am really looking forward to working with Adrian again and seeing the positive impact he will have on AWS customers around the world.

First AWS Certification Study Guide Now Available

My colleague Joe Baron wrote the guest post below to introduce you to a book that he and his colleagues have put together!

-
Jeff;


Are you studying for the AWS Certified Solutions Architect – Associate exam?


The new AWS Certified Solutions Architect Official Study Guide: Associate Exam has just been published by John Wiley & Sons, Inc., and is now available on Amazon.com in both paperback and Kindle format. The 455-page book helps prepare candidates for the AWS Certified Solutions Architect – Associate Exam. It was written by a very experienced team of subject matter experts, all part of the team that wrote, reviewed, and developed the AWS Certified Solutions Architect – Associate exam. The study guide includes an introduction to AWS, chapters on core AWS services, as well as information on AWS security, compliance, and architectural best practices. Each chapter includes targeted information on the topic, as well as key exam essentials, exercises, and chapter review questions (with answers in the appendix). The guide also gives you access to SYBEX online study tools such as practice exams, flashcards, chapter tests and assessment tests.


In addition to the new book, we have a half-day workshop to help you prepare for the exam. In the AWS Certification Exam Readiness Workshop: AWS Certified Solutions Architect – Associate, we review what to expect at the testing center and while taking the exam. We walk you through how the exam is structured, as well as teach you how to interpret the concepts being tested so that you can better eliminate incorrect responses.  You will also have the chance to test concepts we cover through a series of practice exam questions.  At the end of the class, you will receive a voucher to take an online practice exam at no cost.


If you will be attending AWS re:Invent this year, you can purchase a study guide now so that you can prepare to take the Solutions Architect – Associate exam on-site (reserve your seat now).


Joe Baron, Principal Solutions Architect



Sunday, October 23, 2016

Dear Apple, Why Did You Have to KILL the Headphone Jack?

As we all know, Apple finally dropped the headphone jack on its latest iPhone 7.  Unlike the previous models, the box will include a pair of EarPods with a Lightning connector and a Lightning to 3.5mm adaptor so you can continue using your wired headphones – which makes me wonder why Apple would … Continue reading Dear Apple, Why Did You Have to KILL the Headphone Jack?

Friday, October 21, 2016

Congratulations to the Winners of the Serverless Chatbot Competition!

I announced the AWS Serverless Chatbot Competition in August and invited you to build a chatbot for Slack using AWS Lambda and Amazon API Gateway.


Last week I sat down with fellow judges Tim Wagner (General Manager of AWS Lambda) and Cecilia Deng (a Software Development Engineer on Tim's team) to watch the videos and to evaluate all 62 submissions. We were impressed by the functionality and diversity of the entries, as well as the efforts that the entrants put in to producing attractive videos to show their submissions in action.


After hours of intense deliberation we chose a total of nine winners: eight from individuals, teams, and small organizations, and one from a larger organization. Without further ado, here you go:


Individuals, Teams, and Small Organizations
Here are the winners of the Serverless Slackbot Hero Award. Each winner receives one ticket to AWS re:Invent, access to discounted hotel room rates, public announcement and promotion during the Serverless Computing keynote, some cool swag, and $100 in AWS Credits. You can find the code for many of these bots on GitHub. In alphabetical order, the winners are:


AWS Network Helper – “The goal of this project is to provide an AWS network troubleshooting script that runs on a serverless architecture, and can be interacted with via Slack as a chat bot.” GitHub repo.


B0pb0t – “Making Mealtime Awesome.” GitHub repo.


Borges – “Borges is a real-time translator for multilingual Slack teams.” GitHub repo.


CLIve – “CLIve makes managing your AWS EC2 instances a doddle. He understands natural language, so no need to learn a new CLI!”


Litlbot – “Litlbot is a Slack bot that enables realtime interaction with students in class, creating a more engaged classroom and learning experience.” GitHub repo.


Marbot – “Forward alerts from Amazon Web Services to your DevOps team.”


Opsidian – “Collaborate on your AWS infra from Slack using natural language.”


ServiceBot – “Communication platform between humans, machines, and enterprises.” GitHub repo.


Larger Organization
And here's the winner of the Serverless Slackbot Large Organization Award:


Eva – “The virtual travel assistant for your team.” GitHub repo.


Thanks & Congratulations
I would like to personally thank each of the entrants for taking the time to submit their entries to the competition!


Congratulations to all of the winners; I hope to see you all at AWS re:Invent.

-
Jeff;

 


PS – If this list has given you an idea for a chatbot of your very own, please watch our Building Serverless Chatbots video and take advantage of our Serverless Chatbot Sample.



Thursday, October 20, 2016

Apple Complains about Fake Products Being Sold on the Amazon Site

The two big-name brands are at odds as Apple claims that 90% of the “genuine” charging cables and own-brand power adapters being sold on the Amazon site are fake, after purchasing a number of these “power products” and testing them out. Amazon confirms that these products are from Mobile Star.  … Continue reading Apple Complains about Fake Products Being Sold on the Amazon Site

AWS Developer Tool Recap – Recent Enhancements to CodeCommit, CodePipeline, and CodeDeploy

The AWS Developer Tools help you to put modern DevOps practices to work! Here's a quick overview (read New AWS Tools for Code Management and Deployment for an in-depth look):


AWS CodeCommit is a fully-managed source code control service. You can use it to host secure and highly scalable private Git repositories while continuing to use your existing Git tools and workflows (watch the Introduction to AWS CodeCommit video to learn more).


AWS CodeDeploy automates code deployment to Amazon Elastic Compute Cloud (EC2) instances and on-premises servers. You can update your application at a rapid clip, while avoiding downtime during deployment (watch the Introduction to AWS CodeDeploy video to learn more).


AWS CodePipeline is a continuous delivery service that you can use to streamline and automate your release process. Checkins to your repo (CodeCommit or Git) will initiate build, test, and deployment actions (watch Introducing AWS CodePipeline for an introduction). The build can be deployed to your EC2 instances or on-premises servers via CodeDeploy, AWS Elastic Beanstalk, or AWS OpsWorks.


You can combine these services with your existing build and testing tools to create an end-to-end software release pipeline, all orchestrated by CodePipeline.


We have made a lot of enhancements to the Code* products this year and today seems like a good time to recap all of them for you! Many of these enhancements allow you to connect the developer tools to other parts of AWS so that you can continue to fine-tune your development process.


CodeCommit Enhancements
Here's what's new with CodeCommit:



  • Repository Triggers

  • Code Browsing

  • Commit History

  • Commit Visualization

  • Elastic Beanstalk Integration


Repository Triggers – You can create Repository Triggers that Send Notification or Run Code whenever a change occurs in a CodeCommit repository (these are sometimes called webhooks - user-defined HTTP callbacks). These hooks will allow you to customize and automate your development workflow. Notifications can be delivered to an Amazon Simple Notification Service (SNS) topic or can invoke a Lambda function.
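As a sketch of how a trigger looks from the CLI (the repository name, branch, and SNS topic ARN below are all made up for illustration):

```shell
# Attach a trigger to a CodeCommit repository that publishes to an SNS topic
# whenever a reference (such as the master branch) is updated.
# All names and ARNs here are examples:
$ aws codecommit put-repository-triggers \
    --repository-name MyDemoRepo \
    --triggers name=MasterPushTrigger,destinationArn=arn:aws:sns:us-east-1:123456789012:MyRepoTopic,branches=master,events=updateReference
```

Pointing `destinationArn` at a Lambda function instead of an SNS topic invokes the function directly.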



Code Browsing – You can Browse Your Code in the Console. This includes navigation through the source code tree and the code:



Commit History – You can View the Commit History for your repositories (mine is kind of quiet, hence the 2015-era dates):



Commit Visualization – You can View a Graphical Representation of the Commit History for your repositories:



Elastic Beanstalk Integration – You can Use CodeCommit Repositories with Elastic Beanstalk to store your project code for deployment to an Elastic Beanstalk environment.


CodeDeploy Enhancements
Here's what's new with CodeDeploy:



  • CloudWatch Events Integration

  • CloudWatch Alarms and Automatic Deployment Rollback

  • Push Notifications

  • New Partner Integrations


CloudWatch Events Integration – You can Monitor and React to Deployment Changes with Amazon CloudWatch Events by configuring CloudWatch Events to stream changes in the state of your instances or deployments to an AWS Lambda function, an Amazon Kinesis stream, an Amazon Simple Queue Service (SQS) queue, or an SNS topic. You can build workflows and processes that are triggered by your changes. You could automatically terminate EC2 instances when a deployment fails or you could invoke a Lambda function that posts a message to a Slack channel.
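The wiring for this is a rule plus a target. A minimal sketch (the rule name, event pattern, and Lambda ARN are illustrative assumptions):

```shell
# Create a CloudWatch Events rule that matches CodeDeploy state changes,
# then route matching events to a Lambda function (names/ARNs are examples):
$ aws events put-rule \
    --name codedeploy-state-changes \
    --event-pattern '{"source":["aws.codedeploy"]}'

$ aws events put-targets \
    --rule codedeploy-state-changes \
    --targets Id=1,Arn=arn:aws:lambda:us-east-1:123456789012:function:NotifySlack
```

You would also grant CloudWatch Events permission to invoke the function before the target fires.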



CloudWatch Alarms and Automatic Deployment Rollback – CloudWatch Alarms give you another way to monitor your deployments. You can monitor metrics for the instances or Auto Scaling groups managed by CodeDeploy and, if they cross a threshold for a defined period of time, take action: stop a deployment, or change the state of an instance by rebooting, terminating, or recovering it. You can also automatically roll back a deployment in response to a deployment failure or a CloudWatch Alarm.



Push Notifications – You can Receive Push Notifications via Amazon SNS for events related to your deployments and use them to track the state and progress of your deployment.



New Partner Integrations – Our CodeDeploy Partners have been hard at work, connecting their products to ours. Here are some of the most recent offerings:



CodePipeline Enhancements
And here's what's new with CodePipeline:



  • AWS OpsWorks Integration

  • Triggering of Lambda Functions

  • Manual Approval Actions

  • Information about Committed Changes

  • New Partner Integrations


AWS OpsWorks Integration – You can Choose AWS OpsWorks as a Deployment Provider in the software release pipelines that you model in CodePipeline:



You can also configure CodePipeline to use OpsWorks to deploy your code using recipes contained in custom Chef cookbooks.


Triggering of Lambda Functions – You can now Trigger a Lambda Function as one of the actions in a stage of your software release pipeline. Because Lambda allows you to write functions to perform almost any task, you can customize the way your pipeline works:



Manual Approval Actions – You can now add Manual Approval Actions to your software release pipeline. Execution pauses until the code change is approved or rejected by someone with the required IAM permission:



Information about Committed Changes – You can now View Information About Committed Changes to the code flowing through your software release pipeline:



 


New Partner Integrations – Our CodePipeline Partners have been hard at work, connecting their products to ours. Here are some of the most recent offerings:



New Online Content
In order to help you and your colleagues to understand the newest development methodologies, we have created some new introductory material:



Thanks for Reading!
I hope that you have enjoyed this quick look at some of the most recent additions to our development tools.


In order to help you to get some hands-on experience with continuous delivery, my colleagues have created a new Pipeline Starter Kit. The kit includes an AWS CloudFormation template that will create a VPC with two EC2 instances inside, a pair of applications (one for each EC2 instance, both deployed via CodeDeploy), and a pipeline that builds and then deploys the sample application, along with all of the necessary IAM service and instance roles.

-
Jeff;


Tuesday, October 18, 2016

Amazon Aurora Update – Call Lambda Functions From Stored Procedures; Load Data From S3

Many AWS services work just fine by themselves, but even better together! This important aspect of our model allows you to select a single service, learn about it, get some experience with it, and then extend your span to other related services over time. On the other hand, opportunities to make the services work together are ever-present, and we have a number of them on our customer-driven roadmap.


Today I would like to tell you about two new features for Amazon Aurora, our MySQL-compatible relational database:


Lambda Function Invocation – The stored procedures that you create within your Amazon Aurora databases can now invoke AWS Lambda functions.


Load Data From S3 – You can now import data stored in an Amazon Simple Storage Service (S3) bucket into a table in an Amazon Aurora database.


Because both of these features involve Amazon Aurora and another AWS service, you must grant Amazon Aurora permission to access the service by creating an IAM Policy and an IAM Role, and then attaching the Role to your Amazon Aurora database cluster. To learn how to do this, see Authorizing Amazon Aurora to Access Other AWS Services On Your Behalf.
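Once the policy and role exist, associating the role with your cluster is a single CLI call. A sketch, with a made-up cluster identifier and role ARN:

```shell
# Associate an IAM role with an Aurora DB cluster so that Aurora can call
# other AWS services on your behalf (identifier and ARN are examples):
$ aws rds add-role-to-db-cluster \
    --db-cluster-identifier my-aurora-cluster \
    --role-arn arn:aws:iam::123456789012:role/AuroraServiceAccessRole
```

You then point the relevant cluster parameter (such as aws_default_lambda_role or aurora_load_from_s3_role) at the same ARN so the feature knows which role to assume.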


Lambda Function Integration
Relational databases use a combination of triggers and stored procedures to enable the implementation of higher-level functionality. The triggers are activated before or after some operations of interest are performed on a particular database table. For example, because Amazon Aurora is compatible with MySQL, it supports triggers on the INSERT, UPDATE, and DELETE operations. Stored procedures are scripts that can be run in response to the activation of a trigger.


You can now write stored procedures that invoke Lambda functions. This new extensibility mechanism allows you to wire your Aurora-based database to other AWS services. You can send email using Amazon Simple Email Service (SES), issue a notification using Amazon Simple Notification Service (SNS), publish metrics to Amazon CloudWatch, update an Amazon DynamoDB table, and more.


At the application level, you can implement complex ETL jobs and workflows, track and audit actions on database tables, and perform advanced performance monitoring and analysis.


Your stored procedure must call the mysql.lambda_async procedure. This procedure, as the name implies, invokes your desired Lambda function asynchronously, and does not wait for it to complete before proceeding. As usual, you will need to give your Lambda function permission to access any desired AWS services or resources.
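Here is a minimal sketch of such a stored procedure (the procedure name, function ARN, and JSON payload are all hypothetical):

```sql
DELIMITER ;;
-- Hypothetical procedure that asynchronously notifies a Lambda function,
-- e.g. called from an AFTER INSERT trigger on an orders table:
CREATE PROCEDURE notify_order_processor (IN order_id INT)
BEGIN
  CALL mysql.lambda_async(
    'arn:aws:lambda:us-east-1:123456789012:function:ProcessOrder',
    CONCAT('{"order_id": ', order_id, '}')
  );
END;;
DELIMITER ;
```

Because the invocation is asynchronous, the INSERT that fires the trigger is not delayed by the Lambda function's execution time.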


To learn more, read Invoking a Lambda Function from an Amazon Aurora DB Cluster.


Load Data From S3
As another form of integration, data stored in an S3 bucket can now be imported directly into Aurora (until now you would have had to copy the data to an EC2 instance and import it from there).


The data can be located in any AWS region that is accessible from your Amazon Aurora cluster and can be in text or XML form.


To import data in text form, use the new LOAD DATA FROM S3 command. This command accepts many of the same options as MySQL's LOAD DATA INFILE, but does not support compressed data. You can specify the line and field delimiters and the character set, and you can ignore any desired number of lines or rows at the start of the data.
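A sketch of a typical invocation (the bucket, key, table, and column names are made up for illustration):

```sql
-- Load a comma-delimited file from S3 into an Aurora table,
-- skipping a header row (all names here are examples):
LOAD DATA FROM S3 's3://mybucket/data/orders.csv'
    INTO TABLE orders
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES
    (order_id, customer_id, amount);
```

The statement runs on the cluster itself, using the IAM role you attached earlier to read from the bucket.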


To import data in XML form, use the new LOAD XML FROM S3 command. Your XML can look like this:

<row column1="value1" column2="value2"/>
...

Or like this:

<row>
  <column1>value1</column1>
  <column2>value2</column2>
</row>
...

Or like this:

<row>
  <field name="column1">value1</field>
  <field name="column2">value2</field>
</row>
...
To learn more, read Loading Data Into a DB Cluster From Text Files in an Amazon S3 Bucket.


Available Now
These new features are available now and you can start using them today!


There is no charge for either feature; you'll pay the usual charges for the use of Amazon Aurora, Lambda, and S3.

-
Jeff;