When it comes to the cloud, Google certainly isn't taking a summer holiday. Over the past few weeks there has been a string of cloud-related developments from Google, showing that it is sharply focused on delivering innovative services and perhaps narrowing the considerable market-share gap between itself and rivals IBM, Microsoft Azure and Amazon Web Services. There is a new Google cloud data centre in London, a new data transfer service, a new transfer appliance and a new offering for computational drug discovery. And this week came word from Bloomberg that Google is gearing up to launch its first quantum computing cloud services. While the company declined to comment directly on the Bloomberg story, it is understood that quantum computing is an area of keen interest for Google.
New London data centre
Customers of Google Cloud Platform (GCP) can use the new London region (europe-west2) to run applications. Google noted that London is its tenth region, joining the existing European region in Belgium; future European regions include Frankfurt, the Netherlands and Finland. Google also stated that it is working diligently to address EU data protection requirements, and most recently announced a commitment to GDPR compliance across GCP.
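For developers, using the new region is largely a matter of naming europe-west2 when creating resources. As a minimal sketch, assuming the google-cloud-storage Python client library and hypothetical project and bucket names, a Cloud Storage bucket can be pinned to London like this:

```python
# Minimal sketch: create a Cloud Storage bucket in the new London region.
# Assumes the google-cloud-storage client library and application default
# credentials; the project and bucket names below are hypothetical.
from google.cloud import storage

client = storage.Client(project="my-project")

# Pin the bucket's data to the europe-west2 (London) region.
bucket = client.create_bucket("my-london-bucket", location="europe-west2")
print(f"Created {bucket.name} in {bucket.location}")
```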
Introducing Google Transfer Appliance
This is a pre-configured solution that offers up to 480TB in 4U, or 100TB in 2U, of raw data capacity in a single rackmount device. Essentially, it is a high-capacity storage server that a customer installs in a corporate data centre. Once the server is full, the customer simply ships the appliance back to Google, which transfers the data to Google Cloud Storage. With compression, the appliance can hold up to one petabyte.
The Google Transfer Appliance is a practical solution even when high-bandwidth connections are available at both ends. For instance, for customers fortunate enough to have a 10 Gbit/s connection, a 100TB data store would still take around 30 hours to transfer electronically, and a 1PB data library would take over 12 days on the same 10 Gbit/s connection, assuming no drops in connectivity performance. Google is offering the 100TB model at $300, plus shipping via FedEx (approximately $500); the 480TB model is priced at $1,800, plus shipping (approximately $900). Amazon offers a similar appliance, Snowball Edge, for migrating large volumes of data to its cloud the same old-fashioned way.
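The back-of-the-envelope arithmetic is easy to check. A quick sketch in Python follows; note that the roughly 30-hour and 12-day figures above presumably allow for protocol overhead and imperfect link utilisation, so the ideal-case numbers come out somewhat lower:

```python
# Ideal-case transfer time for a given data size over a given link speed.
# Real transfers run slower due to protocol overhead and contention, which
# is why the article's ~30-hour / 12-day figures exceed these results.

def transfer_hours(terabytes: float, gbit_per_s: float) -> float:
    bits = terabytes * 1e12 * 8          # decimal terabytes -> bits
    seconds = bits / (gbit_per_s * 1e9)  # link speed in bits per second
    return seconds / 3600

print(f"100 TB at 10 Gbit/s: {transfer_hours(100, 10):.1f} hours")      # ~22.2 hours
print(f"1 PB at 10 Gbit/s: {transfer_hours(1000, 10) / 24:.1f} days")   # ~9.3 days
```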
Partnership for computational medicine
Under a partnership with Boston-based Silicon Therapeutics, Google recently deployed the company's INSITE Screening platform on Google Cloud Platform (GCP) to analyse over 10 million commercially available molecular compounds as potential starting materials for next-generation medicines. In one week, the platform performed over 500 million docking computations to evaluate how a protein responds to a given molecule. Each computation involved a docking program that predicted the preferred orientation of a small molecule relative to a protein, and the associated energetics, in order to assess whether the molecule would bind to and alter the function of the target protein.
With a combination of Google Compute Engine standard and Preemptible VMs, the partners used up to 16,000 cores, for a total of 3 million core-hours and a cost of about $30,000. Google noted that a final stage of the calculations ran all-atom molecular dynamics (MD) simulations on the top 1,000 molecules to determine which ones to purchase and experimentally assay for activity.
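Those figures are roughly self-consistent, as a quick sanity check shows. The price per core-hour below is an assumption based on 2017 preemptible n1-standard pricing of about $0.01; actual rates varied by machine type:

```python
# Sanity check on the reported screening run: 16,000 cores, 3M core-hours,
# ~$30,000. The price per core-hour is an assumption (~$0.01, roughly the
# 2017 preemptible n1-standard rate); actual rates varied by machine type.

CORES = 16_000
CORE_HOURS = 3_000_000
PRICE_PER_CORE_HOUR = 0.01  # USD, assumed

print(f"Wall-clock at full scale: ~{CORE_HOURS / CORES:.0f} hours "
      f"(~{CORE_HOURS / CORES / 24:.1f} days)")   # ~188 hours, ~7.8 days: one week
print(f"Estimated cost: ~${CORE_HOURS * PRICE_PER_CORE_HOUR:,.0f}")  # ~$30,000
```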
Pushing ahead with Kubernetes
The recent open source release of Kubernetes 1.7 is now available on Container Engine, Google Cloud Platform's (GCP) managed container service. Among other things, the release promotes the Network Policy API to stable, giving better workload isolation within a cluster, a frequently requested security feature in Kubernetes (a minimal sketch appears at the end of this section). Google also announced that Container Engine, which saw more than 10x growth last year, is now available from the following GCP regions:
• Sydney (australia-southeast1).
• Singapore (asia-southeast1).
• Oregon (us-west1).
• London (europe-west2).
Container Engine clusters are already up and running at locations from Iowa to Belgium and Taiwan.
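As an illustration of the workload-isolation feature mentioned above, the sketch below uses the official kubernetes Python client to apply a default-deny ingress network policy. The namespace and policy name are hypothetical, and the cluster is assumed to have network policy enforcement enabled:

```python
# Minimal sketch: a default-deny ingress NetworkPolicy applied via the
# official kubernetes Python client. Assumes a kubeconfig pointing at a
# cluster with network policy enforcement enabled; names are hypothetical.
from kubernetes import client, config

config.load_kube_config()
net_api = client.NetworkingV1Api()

policy = client.V1NetworkPolicy(
    metadata=client.V1ObjectMeta(name="default-deny-ingress"),
    spec=client.V1NetworkPolicySpec(
        pod_selector=client.V1LabelSelector(),  # empty selector = all pods
        policy_types=["Ingress"],               # no ingress rules = deny all inbound
    ),
)
net_api.create_namespaced_network_policy(namespace="default", body=policy)
```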
New strategic partnership with Nutanix
Google has formed a strategic partnership with Nutanix to help remove friction from hybrid cloud deployments for enterprises.
Reimagining virtual private clouds at global scale
Integrating cloud resources from different parts of the world no longer requires negotiating and installing a VPN solution from one or more service providers; Google can do it for you using its own global backbone. VPC is private, and with Google VPC customers get private access to Google services such as storage, big data, analytics or machine learning, without having to give the service a public IP address. Global VPCs are divided into regional subnets that use Google's private backbone to communicate as needed (see the sketch below).
VPC, formerly known as GCP Virtual Networks, offers a privately administered space within Google Cloud Platform (GCP), which means global connectivity across locations and regions and the elimination of silos across projects and teams.
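As a minimal sketch of how a global VPC with regional subnets is assembled, the following uses the Compute Engine v1 REST API via the googleapiclient library; the project, network and subnet names, and the CIDR range, are all hypothetical:

```python
# Minimal sketch: a custom-mode global VPC with one regional subnet, using
# the Compute Engine v1 REST API via googleapiclient. The project, network
# and subnet names, and the CIDR range, are hypothetical.
from googleapiclient import discovery

compute = discovery.build("compute", "v1")
project = "my-project"

# The VPC network itself is global; custom mode lets us place subnets.
compute.networks().insert(project=project, body={
    "name": "my-global-vpc",
    "autoCreateSubnetworks": False,  # custom mode
}).execute()

# Subnets are regional; here, one in the new London region. (In practice,
# wait for the network-creation operation to finish before this call.)
compute.subnetworks().insert(project=project, region="europe-west2", body={
    "name": "london-subnet",
    "network": f"projects/{project}/global/networks/my-global-vpc",
    "ipCidrRange": "10.156.0.0/20",
}).execute()
```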
Further information on Google Cloud Platform is available on the Google Cloud blog.