My colleague Loren Segal shared the blog post below. As you can see, he’s really excited about the new AWS SDK for Go!
— Jeff;
Today we are announcing the first Developer Preview Release of the AWS SDK for Go (v0.6.0). If you have not been following along with development, the AWS SDK for Go now has full AWS service support, exponential backoff retry handling, and more. Since our last announcement, we’ve also added a concurrent streaming upload and download manager for Amazon S3, and built-in support for response pagination, with resource waiter support in the works. We have also added a Getting Started Guide, which we will continue to update as we add new features to the SDK.
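To give a quick taste of the new upload manager (the download manager appears in the full example at the end of this post), here is a minimal sketch of streaming a local file up to Amazon S3. It assumes the uploader mirrors the downloader's API, i.e. a NewUploader constructor that accepts options (nil for the defaults) and an Upload method that takes an UploadInput; the bucket and file names are placeholders, so check the s3manager package documentation for the exact signatures in your SDK version.

package main

import (
    "os"

    "github.com/aws/aws-sdk-go/service/s3/s3manager"
)

func main() {
    // Open the local file to stream to Amazon S3 (placeholder path).
    fd, err := os.Open("logs/app.log")
    if err != nil {
        panic(err)
    }
    defer fd.Close()

    // NewUploader(nil) is assumed to build an uploader with default options,
    // mirroring NewDownloader(nil) in the download example below.
    uploader := s3manager.NewUploader(nil)

    bucket, key := "MyBucket", "logs/app.log"
    // The upload manager streams the body to S3 concurrently in parts.
    if _, err := uploader.Upload(&s3manager.UploadInput{
        Bucket: &bucket,
        Key:    &key,
        Body:   fd,
    }); err != nil {
        panic(err)
    }
}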
This release also comes with a few organizational and process changes. First, we have moved our GitHub repository from the awslabs organization to aws/aws-sdk-go, which signifies that the SDK is no longer in an experimental state. Please be sure to update your “import” statements to reflect the new location (see the example below for the full import statement). Second, being in Developer Preview means that we will now publish releases for new services and service updates as they become available, which should help users stay up to date with all of our new AWS functionality.
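For instance, an import of the S3 client changes only in the organization portion of the path (this assumes you were already on the service/ package layout under the old repository):

// Previously under the awslabs organization:
//   import "github.com/awslabs/aws-sdk-go/service/s3"
// Now:
import "github.com/aws/aws-sdk-go/service/s3"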
We’re excited to get this SDK into your hands and to hear what you think about the changes and new features we’ve been working on. The goal of our Developer Preview cycle is to gather feedback about what works and what does not, so that we can tweak the API before locking it down for a stable 1.0 release. Don’t be shy about letting us know what you do and don’t like on our GitHub Issues page!
To end this post, I thought I’d share a small example that uses response pagination and our Amazon S3 download manager to pull a series of S3 objects down to your local file system. For simplicity, each file is downloaded sequentially, though you could wrap these operations in goroutines and download them concurrently as well (a sketch of that variation follows the example). Note that when using the download manager, each individual download operation will still retrieve its data concurrently in chunks.
package main

import (
    "fmt"
    "os"
    "path/filepath"

    "github.com/aws/aws-sdk-go/service/s3"
    "github.com/aws/aws-sdk-go/service/s3/s3manager"
)

var (
    Bucket         = "MyBucket" // Download from this bucket
    Prefix         = "logs/"    // Using this key prefix
    LocalDirectory = "s3logs"   // Into this directory
)

func main() {
    manager := s3manager.NewDownloader(nil)
    d := downloader{bucket: Bucket, dir: LocalDirectory, Downloader: manager}

    // Page through the objects under the prefix, downloading each page's
    // contents as it arrives.
    client := s3.New(nil)
    params := &s3.ListObjectsInput{Bucket: &Bucket, Prefix: &Prefix}
    client.ListObjectsPages(params, d.eachPage)
}

// downloader pairs the S3 download manager with the bucket to read from
// and the local directory to write into.
type downloader struct {
    *s3manager.Downloader
    bucket, dir string
}

// eachPage is invoked once per page of ListObjects results; returning true
// tells the paginator to keep going.
func (d *downloader) eachPage(page *s3.ListObjectsOutput, more bool) bool {
    for _, obj := range page.Contents {
        d.downloadToFile(*obj.Key)
    }
    return true
}

func (d *downloader) downloadToFile(key string) {
    // Create the directories in the path
    file := filepath.Join(d.dir, key)
    if err := os.MkdirAll(filepath.Dir(file), 0775); err != nil {
        panic(err)
    }

    // Set up the local file
    fd, err := os.Create(file)
    if err != nil {
        panic(err)
    }
    defer fd.Close()

    // Download the file using the AWS SDK
    fmt.Printf("Downloading s3://%s/%s to %s...\n", d.bucket, key, file)
    params := &s3.GetObjectInput{Bucket: &d.bucket, Key: &key}
    d.Download(fd, params)
}
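And, as mentioned above, each page's downloads could also run concurrently. Here is a minimal sketch of that variation using a sync.WaitGroup; it is meant to drop into the same file as the example above (with "sync" added to the imports), and the eachPageConcurrently name is just illustrative.

// eachPageConcurrently downloads every object on the page in its own
// goroutine and waits for them all before asking for the next page. Each
// key maps to a distinct local file, so the goroutines share no mutable state.
func (d *downloader) eachPageConcurrently(page *s3.ListObjectsOutput, more bool) bool {
    var wg sync.WaitGroup
    for _, obj := range page.Contents {
        wg.Add(1)
        go func(key string) {
            defer wg.Done()
            d.downloadToFile(key)
        }(*obj.Key)
    }
    wg.Wait() // finish this page before the paginator fetches the next one
    return true
}

To use it, simply pass d.eachPageConcurrently instead of d.eachPage to ListObjectsPages in main.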
— Loren Segal, Software Development Engineer