Tuesday, March 31, 2015

AWS Console Mobile App Update – Support for AWS CloudFormation & Auto Scaling Group Edits

We have made two additions to the AWS Console mobile app (available on Amazon Appstore, Google Play, and iTunes). You can view the status of your stacks and you can edit the min, max, and desired capacities of your Auto Scaling groups. Here’s the main screen of the newly revised console: CloudFormation Features You can […]

European Union Data Protection Authorities Approve Amazon Web Services’ Data Processing Agreement

As you all know, the security, privacy, and protection of our customers' data is our number one priority, and as such we work very closely with regulators to ensure that customers can be assured they are getting the right protections when processing and storing data on AWS. I am especially pleased that the group of European Union (EU) data protection authorities known as the Article 29 Working Party has approved the AWS Data Processing Agreement (DPA), assuring customers that it meets the high standards of EU data protection laws. The media alert below, which went out today, gives the details:

European Union Data Protection Authorities Approve Amazon Web Services’ Data Processing Agreement

Customers All Over the World Are Assured that AWS Agreement Meets Rigorous EU Privacy Laws

Brussels – March 31, 2015 – Amazon Web Services (AWS) today announced that the group of European Union (EU) data protection authorities known as the Article 29 Working Party has approved the AWS Data Processing Agreement (DPA), assuring customers that it meets the high standards of EU data protection laws. The approval of the AWS DPA, which embodies the Standard Contractual Clauses (often referred to as Model Clauses), means that AWS customers wishing to transfer personal data from the European Economic Area (EEA) to other countries can do so with even more knowledge that their content on AWS will be given the same high level of protection it receives in the EEA. For more detail on the approval from the Article 29 Working Party, visit the Luxembourg Data Protection Authority webpage here: http://www.cnpd.public.lu/en/actualites/international/2015/03/AWS/index.html

The AWS cloud is already being used extensively across the EU by startups, government agencies, educational institutions and leading enterprises such as Réseau Ferré de France and Veolia in France, St James’s Place and Shell in the UK, and Talanx and Hubert Burda Media in Germany. AWS customers have always had the freedom to choose the location where they store and process their content, with the assurance that AWS will not move it from their chosen region. Customers have access to 11 AWS regions around the globe, including two in the EU – Ireland (Dublin) and Germany (Frankfurt) – each composed of multiple Availability Zones, enabling customers to build highly secure and available applications. The DPA with Model Clauses gives AWS customers more choice when it comes to data protection and assures them that their content receives the same high levels of data protection, in accordance with European laws, no matter which AWS infrastructure region they choose around the world. The DPA is now available on request to all customers that require it.

“The security, privacy, and protection of our customers' data is our number one priority,” said Dr Werner Vogels, Chief Technology Officer, Amazon.com. “Providing customers a DPA that has been approved by the EU data protection authorities is another way in which we are giving them assurances that they will receive the highest levels of data protection from AWS. We have spent a lot of time building tools, like security controls and encryption, to give customers the ability to protect their infrastructure and content. We will always strive to provide the highest level of data security for AWS customers in the EU and around the world.”

In the letter issued to AWS, the Article 29 Working Party said, “The EU Data Protection Authorities have analysed the arrangement proposed by Amazon Web Services” and “have concluded that the revised Data Processing Addendum is in line with Standard Contractual Clause 2010/87/EU and should not be considered as ‘ad-hoc’ clauses.” This means customers can sign the AWS Data Processing Addendum with Model Clauses without the need for authorization from data protection authorities, as would be necessary for contract clauses intended to address EU privacy rules that have not been approved, known as “ad hoc clauses.”

As well as having a DPA that has been approved by the Article 29 Working Party, AWS is fully compliant with all applicable EU data protection laws and maintains robust global security standards, such as ISO 27001, SOC 1, 2, 3 and PCI DSS Level 1. In 2013, the AWS Cloud was approved by De Nederlandsche Bank for use in the Dutch financial services sector, opening the door for financial services firms in The Netherlands to store confidential data and run mission-critical applications on AWS. AWS has teams of Solutions Architects, Account Managers, Trainers and other staff in the EU expertly trained on cloud security and compliance to assist AWS customers as they move their applications to the cloud. AWS also helps customers meet local security standards and has launched a Customer Certification Workbook, developed by independent certification body TÜV TRUST IT, providing customers with guidance on how to become certified for BSI IT Grundschutz in Germany. A copy of the workbook can be found at: http://aws.amazon.com/compliance/

“The EU has the highest data protection standards in the world and it is very important that European citizens' data is protected,” said Antanas Guoga, Member of the European Parliament. “I believe that the Article 29 Working Party decision to approve the data processing agreement put forward by Amazon Web Services is a step in the right direction. I am pleased to see that AWS puts an emphasis on the protection of European customer data. I hope this decision will also help to drive further innovation in the cloud computing sector across the EU.”

“For us, like many companies, data privacy is paramount,” said JP Schmetz, Chief Scientist at Hubert Burda Media. “One of the reasons we chose AWS is the fact that they put so much emphasis on maintaining the highest levels of security and privacy for all of their customers. This is why we are moving mission critical workloads to AWS.”

For more information on AWS Model Clauses please visit: http://aws.amazon.com/compliance/eu-data-protection More information on AWS’ data protection practices can be found on the AWS Data Protection webpage at: http://aws.amazon.com/compliance/data-privacy-faq/. A full list of compliance certifications and a list of the robust controls in place at AWS to maintain security and data protection for customers can be found on the AWS compliance webpage at: http://aws.amazon.com/compliance/.

The Next Generation of Dense-storage Instances for EC2

Perhaps you, like many other users, store and process huge amounts of data in the cloud. Today we are announcing a new generation of Dense-storage instances that will provide you additional options for processing multi-terabyte data sets. New D2 Instances The new D2 instances are designed to provide you with additional compute power and memory […]

Monday, March 30, 2015

AWS Week in Review – March 23, 2015

Let’s take a quick look at what happened in AWS-land last week: Monday, March 23 A post on the provided a helpful pointer to some new documentation on Amazon S3 Client-side Crypto Meta Information. The disclosed the availability of a New Security and Compliance Workbook: IT-Grundschutz. Tuesday, March 24 We announced Cross-Region Replication for Amazon […]

Tuesday, March 24, 2015

Now Available – AWS Marketplace in the Frankfurt Region

AWS Marketplace makes it easy for you to find, buy, and quickly start using a wide variety of software products from developers located all over the world: Open in Germany Today we are making AWS Marketplace available to users of our new region. If you are using this region you can make use of over 700 products […]

Monday, March 23, 2015

Congrats to the First Six AWS Managed Service Program Providers

We announced the AWS Managed Service Partner program at last year’s . We created the program in order to help our customers to find Partners who can deliver managed services in the cloud. In order for an APN Partner to become an approved member of this program, the quality of their offering must meet a […]

AWS Week in Review – March 16, 2015

Let’s take a quick look at what happened in AWS-land last week: Monday, March 16 We announced that the EC2 Container Service is now Available in the EU (Ireland) Region. We published the Second Release Candidate of the Amazon Linux AMI. We updated the AWS SDK for Unity. Tuesday, March 17 Paul Xue took on […]

Thursday, March 19, 2015

Now Available: 16 TB and 20,000 IOPS Elastic Block Store (EBS) Volumes

Last year I told you about Larger and Faster EBS Volumes and asked you to stay tuned for availability. Starting today you can create Provisioned IOPS (SSD) volumes that store up to 16 TB (terabytes) and process up to 20,000 IOPS, with a maximum throughput of 320 MBps (megabytes per second). You can also create […]

Thursday, March 12, 2015

New – AWS API Activity Lookup in CloudTrail

My colleague Sivakanth Mundru sent another guest post my way! This one shows you how to look up AWS API activity using . — Jeff; “The ability to look up API activity in AWS CloudTrail helps us easily troubleshoot security incidents and operational issues. We can now take immediate actions such as following up with […]

Wednesday, March 11, 2015

Amazon ElastiCache Update – Redis 2.8.19 Now Available

You can use Amazon ElastiCache to easily create, scale, and maintain cloud-based in-memory key-value stores that use the Memcached or Redis engines. Today we are making ElastiCache even more useful by adding support for version 2.8.19 of the Redis engine. Compared to version 2.8.6 (until now the latest version supported by ElastiCache Redis), this version of Redis […]

Observations on the Importance of Cloud-based Analytics

Cloud computing is enabling amazing new innovations in both consumer and enterprise products as it becomes the new normal for organizations of all sizes. So many exciting new areas are being empowered by the cloud that it is fascinating to watch. AWS is enabling innovations in areas such as healthcare, automotive, life sciences, retail, media, energy, and robotics; the breadth of it is mind-boggling and humbling.

Despite all of the amazing innovations we have already seen, we are still on Day One in the cloud; at AWS we will continue to use our inventive powers to build new tools and services that enable even more exciting innovations by our customers, innovations that will touch every area of our lives. Many of them will have a significant analytics component or may even be driven entirely by analytics. For example, many of the Internet of Things innovations that have come to life on AWS in the past few years have a significant analytics component.

I have seen our customers do so many radical new things with the analytics tools that we and our partners make available that I would like to share a few observations with you.

Cloud analytics are everywhere. There is almost no consumer or business area that is not impacted by cloud-enabled analytics. Often it is hidden from the consumer’s eye, empowering applications rather than being the end product, but analytics is becoming ever more prevalent. From retail recommendations to genomics-based product development, from financial risk management to start-ups measuring the effect of their new products, from digital marketing to fast processing of clinical trial data, all are taken to the next level by cloud-based analytics.

At AWS we have seen evidence of this as Amazon Redshift, our data warehouse service, has become the fastest-growing cloud service in the history of the company. For many businesses, Amazon Redshift is even the first cloud service they ever use. Adoption is really starting to explode in 2015 as more and more businesses understand the power of analytics to empower their organizations. Integration with many of the standard analytics tools, such as Tableau, Jaspersoft, and Pentaho, makes Redshift extremely powerful.

Cloud enables self-service analytics. In the past, analytics within an organization was the pinnacle of old-style IT: a centralized data warehouse running on specialized hardware. In the modern enterprise this scenario is no longer acceptable. Analytics plays a crucial role in helping business units become more agile, respond faster to the needs of the business, and build products customers really want, yet they are still bogged down by this centralized, oversubscribed, old-style data warehouse model. Cloud-based analytics changes this completely.

A business unit can now create its own data warehouse in the cloud, with a size and speed that exactly match what it needs and is willing to pay for. It can be a small two-node data warehouse that runs during the day, a big 1,000-node data warehouse that runs for just a few hours on a Thursday afternoon, or one that runs overnight so that personnel have the data they need when they come into work in the morning.
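As a rough sketch of what this self-service provisioning looks like in practice, the snippet below builds the parameters a business unit might pass to the Amazon Redshift CreateCluster API. The cluster names, node counts, and placeholder credentials are illustrative assumptions, not a prescription; the actual call (via boto3) is shown in a comment so the sketch stays self-contained.

```python
# Sketch: sizing a self-service data warehouse to the workload at hand.

def redshift_cluster_params(name, nodes, node_type="dw1.xlarge"):
    """Build parameters for an Amazon Redshift CreateCluster call.

    A single-node cluster uses ClusterType 'single-node' and omits
    NumberOfNodes; multi-node clusters must specify both.
    """
    params = {
        "ClusterIdentifier": name,
        "NodeType": node_type,
        "MasterUsername": "admin",          # placeholder credentials
        "MasterUserPassword": "CHANGE_ME",  # placeholder credentials
    }
    if nodes > 1:
        params["ClusterType"] = "multi-node"
        params["NumberOfNodes"] = nodes
    else:
        params["ClusterType"] = "single-node"
    return params

# A small two-node warehouse for daytime reporting...
daytime = redshift_cluster_params("bu-reporting", nodes=2)
# ...or a larger cluster spun up for a few hours of heavy analysis.
thursday_burst = redshift_cluster_params("quarterly-crunch", nodes=100)

# With boto3 installed and AWS credentials configured, the real call is:
#   import boto3
#   boto3.client("redshift").create_cluster(**daytime)
```

When the Thursday-afternoon job finishes, the cluster can simply be deleted; the business unit pays only for the hours it actually ran.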

A great example of this is the work the global business publication The Financial Times (FT) is doing with analytics. The FT is over 120 years old, and it is using the cloud to run Business Intelligence (BI) workloads that completely revolutionize how it offers content to customers: running analytics on all its stories, personalizing the paper, and giving readers a more tailored reading experience. With the new BI system the company is able to run analytics across 140 stories per day, in real time, and has cut the turnaround for analytics tasks from months to days. The FT has also expanded its BI to better target advertising to readers. By using Amazon Redshift it is able to process 120 million unique events per day and integrate its internal logs with external data sources, helping to create a more dynamic paper for its readers, all while cutting its data warehouse cost by 80%.

Cloud analytics will enable everything to become smart. These days everything has the ability to become “smart”: a smart watch, smart clothes, a smart TV, a smart home, a smart car. However, in almost all cases this “smartness” runs in software in the cloud, not in the object or the device itself.

Whether it is the thermostat in your home, the activity tracker on your wrist, or the smart movie recommendations on your beautiful ultra HD TV, all are powered by analytics engines running in the cloud. As all the intelligence of these smart products lives in the cloud, a new generation of devices is being spawned. A good example here is the work Philips is doing to make street lighting smart with their CityTouch product.

Philips CityTouch is an intelligent light management system for city-wide street lighting. It offers connected street lighting solutions that allow entire suburbs and cities to actively control street lighting and manage the after-dark environment in real time. This allows local councils to keep certain streets well lit to accommodate high foot traffic, to bring lighting up during adverse weather or when ambient light dims to a dangerous level, or to turn lighting down where there are no people, for example in an industrial estate. This technology is already being used in places like Prague and in suburbs of London. CityTouch uses the cloud as the backend technology to run the system and to extract business value from the large amounts of data collected from sensors installed in the street lights. This data allows councils to better understand their cities after dark and to employ more efficient light management programmes, avoiding excess light pollution, which can have an adverse effect on residents and wildlife around cities.

Cloud analytics improves city life. Related to the above is the ability of cloud analytics to take information from the city environment and improve the living conditions of citizens around the world. A good example is the work the Urban Center for Computation and Data of the City of Chicago is doing. The City of Chicago is one of the first to deploy sensors throughout the city that permanently measure air quality, light intensity, sound volume, heat, precipitation, wind, and traffic. The data from these sensors streams into the cloud, where it is analyzed to find ways to improve the lives of its citizens. The collected datasets from Chicago’s “Array of Things” will be made publicly available in the cloud for researchers to find innovative ways to analyze the data.

Many cities have already expressed interest in following Chicago’s lead and using the cloud to improve city life, and many in Europe, such as Peterborough City Council in the UK, are beginning to do the same. Peterborough City Council is making public data sets available to outsource innovation to the local community. Different data sets from the council are being mashed together: people are mapping, for example, crime data against weather patterns to help the council understand whether there are more burglaries when it is hot and how it should resource the local police force, or mapping hospital admission data against weather to identify trends and patterns. This data is being made open and available to everyone to drive innovation, thanks to the power of the cloud.
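To make the crime-versus-weather mash-up concrete, here is a toy sketch of the kind of join described above: daily burglary counts matched against daily temperatures, then averaged for hot versus cool days. The figures and the 20°C threshold are invented purely for illustration; real open-data sets would be far larger and messier.

```python
# Toy "mash-up" of two open data sets keyed by date.
from statistics import mean

burglaries = {"2015-03-01": 11, "2015-03-02": 7, "2015-03-03": 14, "2015-03-04": 6}
temperature_c = {"2015-03-01": 24, "2015-03-02": 12, "2015-03-03": 26, "2015-03-04": 9}

def mean_burglaries_by_weather(crimes, temps, hot_threshold=20):
    """Join the two datasets on date and average counts per weather band."""
    hot, cool = [], []
    for day, count in crimes.items():
        if day not in temps:
            continue  # skip days with no weather reading
        (hot if temps[day] >= hot_threshold else cool).append(count)
    return {"hot": mean(hot), "cool": mean(cool)}

result = mean_burglaries_by_weather(burglaries, temperature_c)
# result compares average burglary counts on hot vs cool days
```

The same join-and-aggregate pattern applies to the hospital-admissions example; only the data sets change.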

Cloud analytics enables the Industrial Internet of Things. Often when we think about the Internet of Things (IoT) we focus on what it will mean for the consumer. But we are already seeing the rise of a different IoT: the Industrial Internet of Things. Industrial machinery is instrumented and Internet-connected so that it can stream data into the cloud to gain usage insights, improve efficiencies, and prevent outages.

Whether it is General Electric instrumenting their gas turbines, Shell dropping sensors into their oil wells, Kärcher with fleets of industrial cleaning machines, or construction sites enabled with sensors from Deconstruction, all of these send continuous data streams into the cloud for real-time analysis.

Cloud enables video analytics. For a long time video was recorded to be archived, played back, and watched. With the unlimited processing power of the cloud a new trend is arising: treating video as a data stream to be analyzed. This is called Video Content Analysis (VCA), and it has many application areas, from retail to transportation.

A common area of application is in locations where video cameras are already present, such as malls and large retail stores. Video is analyzed to help stores understand traffic patterns: analytics provide the number of customers moving through the store, their dwell times, and other statistics. This allows retailers to improve their store layouts and in-store marketing effectiveness.
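One of the statistics mentioned above, dwell time, is easy to sketch once the video pipeline has been reduced to events. The snippet below assumes a hypothetical event format of (visitor id, "enter"/"exit", timestamp in seconds) for a single store zone; real VCA systems emit far richer detections, but the aggregation step looks much like this.

```python
# Computing per-visitor dwell time from zone entry/exit events.
from collections import defaultdict

events = [
    ("v1", "enter", 0), ("v2", "enter", 30),
    ("v1", "exit", 120), ("v2", "exit", 90),
]

def dwell_times(events):
    """Return total seconds each visitor spent in the zone."""
    entered = {}
    dwell = defaultdict(float)
    for visitor, kind, ts in sorted(events, key=lambda e: e[2]):
        if kind == "enter":
            entered[visitor] = ts
        elif kind == "exit" and visitor in entered:
            dwell[visitor] += ts - entered.pop(visitor)
    return dict(dwell)
```

Summing or averaging these per-zone figures across a day gives the layout and marketing statistics retailers act on.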

Another popular area is that of real time crowd analysis at large events, such as concerts, to understand movement throughout the venue and remove bottlenecks before they occur in order to improve visitor experience. Similar applications are used by transport departments to regulate traffic, detect stalled cars on highways, detect objects on high speed railways, and other transport issues.

Another innovative example that has taken VCA into the consumer domain is Dropcam. Dropcam analyzes video streamed by Internet-enabled video cameras to provide its customers with alerts. Dropcam is currently the largest video producer on the Internet, ingesting more video data into the cloud than YouTube.

VCA is also becoming a crucial tool in sports management. Teams are using video analysis to process many different angles on the players. For example, the many video streams recorded during a Premier League match are used by teams to improve player performance and drive specific training schemes.

In the US, video analytics is being used by MLB baseball teams to provide augmented real-time analytics on video screens around the stadium, while the NFL is using VCA to create automatically condensed versions of American football games, bringing the run time down by 60-70%.

Cloud transforms health care analytics. Data analytics is quickly becoming central to analyzing health risk factors and improving patient care. Healthcare is under pressure to reduce costs and speed up patient care, and the cloud is playing a crucial role in helping healthcare go digital.

Cloud powers innovative solutions such as Philips HealthSuite, a platform that manages healthcare data and provides support for doctors as well as patients. The Philips HealthSuite digital platform analyzes and stores 15 PB of patient data gathered from 390 million imaging studies, medical records, and patient inputs, providing healthcare providers with actionable data they can use to directly impact patient care. This is reinventing healthcare for billions of people around the world. As we move through 2015 and beyond we can expect to see the cloud play an even bigger role in the advancement of patient diagnosis and care.

Cloud enables secure analytics. With analytics enabling so many new areas, from online shopping to healthcare to home automation, it becomes paramount that analytics data is kept secure and private. The deep integration of encryption into the storage and analytics engines, with users able to bring their own encryption keys, ensures that only the users of these services have access to the data and no one else.

In Amazon Redshift, data blocks, system metadata, partial results from queries, and backups are encrypted with randomly generated keys, and this set of keys is in turn encrypted with a master key. This encryption is standard operating practice; customers do not need to do anything. Customers who want full control over who can access their data can use their own master key to encrypt the data block keys. They can use the AWS Key Management Service to securely manage their own keys, stored in Hardware Security Modules, ensuring that only the customer has access to the keys and that only the customer controls who has access to their data.
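The two-tier key scheme described above, often called envelope encryption, can be sketched in a few lines. To keep the sketch dependency-free, the "cipher" below is a toy XOR stand-in and emphatically not real cryptography (Redshift uses AES); only the key hierarchy, where per-block keys are wrapped under a customer-held master key, is the point.

```python
# Toy illustration of envelope encryption: per-block data keys wrapped
# under a master key. XOR stands in for a real block cipher such as AES.
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Toy symmetric 'cipher': XOR with a repeated key (NOT real crypto)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

master_key = secrets.token_bytes(32)  # held by the customer (e.g. in KMS)

# Encrypt each data block under its own randomly generated block key...
block = b"partial query results"
block_key = secrets.token_bytes(32)
encrypted_block = xor_bytes(block, block_key)

# ...then wrap the block key under the master key for storage.
wrapped_key = xor_bytes(block_key, master_key)

# Decryption unwraps the block key first, then decrypts the block.
recovered = xor_bytes(encrypted_block, xor_bytes(wrapped_key, master_key))
```

The design advantage is that revoking or rotating the master key invalidates every wrapped block key at once, without re-encrypting the data blocks themselves under a new scheme first.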

Cloud enables collaborative research analytics. As Jim Gray predicted in his Fourth Paradigm, much of the research world is shifting from computational models to data-driven science. We already see this in the many researchers making their datasets available for collaborative real-time analytics in the cloud. Whether these data sets are streamed from Mars or from the bottom of the ocean, the cloud is the place to ingest, store, organize, analyze, and share this data.

An interesting commercial example is the connected sequencing systems from Illumina; sequenced data is streamed directly to the cloud, where the customer has access to BaseSpace, a cloud-based marketplace for algorithms that can be used to process the data.

At AWS we are proud to power the progress that puts analytics tools in the hands of everyone. We are humbled by what our customers are already doing with our current toolset. But it is still Day One; we will continue to innovate in this space so that our customers can go on to do even greater things.

Monday, March 9, 2015

New AWS Webinar Series – Introductions, Best Practices, and More

We have set up a new AWS Webinar Series. In March you can attend a webinar and Get started with AWS or a specific AWS Service (OpsWorks or WorkSpaces), master an AWS service (EC2), learn AWS best practices (content delivery, availability, or security), deploy enterprise workloads (SAP), or optimize for cost and performance (RDS). The […]

AWS Week in Review – March 2, 2015

Let’s take a quick look at what happened in AWS-land last week: Monday, March 2 We announced Amazon EMR Support for New Instance Families and a Price Reduction in the AWS GovCloud (US) region. The N2W Blog showed you How-to Backup Your AWS Cloud Based PostgreSQL Database. The Cloud Academy Blog talked about Windows Server […]

Thursday, March 5, 2015

AWS Management Portal for vCenter Update – Auto Upgrades, Log Upload, Queued Imports

We have updated . This plug-in runs within your existing vCenter environment and gives you the power to migrate VMware VMs to and to manage AWS resources from within vCenter. Today’s update includes automatic upgrades, log uploading, and queued import tasks. Automatic Upgrades The management console now displays a prompt when a new version is […]

CloudTrail Integration with CloudWatch in Four More Regions

My colleague Sivakanth Mundru sent me a guest post with and integration news, along with information about a new template to help you to get going more quickly. — Jeff; At re: Invent 2014, we launched integration with Logs in the , , and regions. With this integration, you can monitor for specific API calls […]

Wednesday, March 4, 2015

Save the Date – AWS re:Invent 2015 is Coming Soon

I am already looking forward to my fourth re:Invent! This year’s conference will take place from October 6 to 9 in Las Vegas. We’ll be announcing more details and opening up the registration system in May. You can sign up here in order to receive email updates. We are already working on the services, presentations, […]