Cloud
AWS – Shared account customization now available for AWS Control Tower Account Factory for Terraform
AWS Control Tower now lets you manage and customize your shared and management accounts with Account Factory for Terraform (AFT). You can centralize your account customization management and increase governance coverage of your AWS Control Tower environment while still protecting the security of your account configurations. Shared account customization helps customers who want to use the same mechanism for customization across all of their accounts. AFT has also made a role change to help you better manage the permissions of your customizations, so you can fully automate permission management for AFT to act on all of your accounts with any level of granularity.
Read More for the details.
AWS – Amazon Connect now supports custom templates for agent tasks
Amazon Connect now allows you to create custom task templates, making it easy for agents to capture the right information to create and complete tasks. Amazon Connect Tasks empowers you to prioritize, assign, and track all contact center agent tasks to completion, improving agent productivity and ensuring customer issues are quickly resolved. You can easily compose templates for a variety of scenarios, such as investigating billing issues or new insurance claims, allowing agents to choose the template that best suits the situation. For example, when handling a billing inquiry, a task template can pre-populate data and guide agents to gather the additional information needed to quickly resolve the issue. Task templates are supported out of the box in the Amazon Connect agent application, with no manual configuration required. To learn more, see the API reference guide, help documentation, or visit our webpage.
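For readers who want to script this, here is a minimal boto3 sketch of creating a billing-inquiry template with the CreateTaskTemplate API; the instance ID, contact flow ID, and field names are illustrative placeholders, and the field types should be checked against the API reference:

```python
import boto3

connect = boto3.client("connect")

# Minimal sketch: create a task template for billing inquiries.
# INSTANCE_ID and CONTACT_FLOW_ID are placeholders for your own resources.
response = connect.create_task_template(
    InstanceId="INSTANCE_ID",
    Name="BillingInquiry",
    Description="Collects the details agents need to resolve billing issues",
    ContactFlowId="CONTACT_FLOW_ID",
    Status="ACTIVE",
    Fields=[
        {"Id": {"Name": "Name"}, "Type": "NAME"},
        {"Id": {"Name": "Description"}, "Type": "DESCRIPTION"},
        {"Id": {"Name": "Account number"}, "Type": "TEXT"},
        {"Id": {"Name": "Disputed amount"}, "Type": "NUMBER"},
    ],
)
print(response["Arn"])
```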
Read More for the details.
AWS – Amazon Connect launches an API to programmatically transfer tasks
Amazon Connect now provides a TransferContact API to programmatically transfer tasks to another flow, to an agent queue, or to a shared queue for distribution to an available agent. Amazon Connect Tasks empowers contact center managers to prioritize, assign, track, and automate customer service tasks across the disparate applications used by agents. Using this API, contact center managers can now transfer tasks directly from their custom analytics dashboards for timely resolution. You can already dynamically prioritize and assign tasks using Amazon Connect flows based on agent skill set, availability, and information about the task (e.g., type, priority/urgency, category).
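As a rough illustration of the new API, the boto3 call below transfers an in-progress task to a shared queue; all IDs are placeholders for your own Amazon Connect resources:

```python
import boto3

connect = boto3.client("connect")

# Minimal sketch: transfer an in-progress task to a shared queue.
# All IDs below are placeholders for your own Amazon Connect resources.
response = connect.transfer_contact(
    InstanceId="INSTANCE_ID",
    ContactId="CONTACT_ID",          # the task to transfer
    QueueId="QUEUE_ID",              # or UserId=... to target an agent's queue
    ContactFlowId="CONTACT_FLOW_ID", # flow that handles the transferred task
)
print(response["ContactId"], response["ContactArn"])
```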
Read More for the details.
AWS – Announcing AWS PrivateLink support for AWS Panorama
AWS Panorama customers can now use AWS PrivateLink to access AWS Panorama from their Amazon Virtual Private Cloud (Amazon VPC) without using public endpoints, and without requiring the traffic to traverse the Internet. Using AWS PrivateLink, you can access AWS Panorama endpoints easily and securely by keeping your traffic within the AWS network, while simplifying your internal network architecture. You no longer need to use an internet gateway, Network Address Translation (NAT) devices, or firewall proxies to connect to AWS Panorama.
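Access is set up like any other interface endpoint. The sketch below shows one plausible way to create it with boto3; the VPC, subnet, and security group IDs are placeholders, and the Panorama service name should be confirmed for your Region:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Minimal sketch: create an interface VPC endpoint so traffic to AWS Panorama
# stays on the AWS network. VPC, subnet, and security group IDs are placeholders,
# and the service name should be verified for your Region.
response = ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0123456789abcdef0",
    ServiceName="com.amazonaws.us-east-1.panorama",
    SubnetIds=["subnet-0123456789abcdef0"],
    SecurityGroupIds=["sg-0123456789abcdef0"],
    PrivateDnsEnabled=True,
)
print(response["VpcEndpoint"]["VpcEndpointId"])
```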
Read More for the details.
AWS – Amazon Chime SDK now supports centralized attendee controls
The Amazon Chime SDK now lets developers centrally control each participant’s ability to send and receive audio, video, and screen share within a WebRTC session. Amazon Chime SDK lets developers add intelligent real-time audio, video, and screen share to their web and mobile applications. Enforcement of attendee capabilities is centralized in the WebRTC session, so developers do not have to rely on logic within client applications, which may be outdated in older versions.
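A server-side sketch of what this could look like with boto3, assuming the UpdateAttendeeCapabilities API that this feature introduces; the meeting and attendee IDs are placeholders:

```python
import boto3

meetings = boto3.client("chime-sdk-meetings")

# Minimal sketch: restrict an attendee to receive-only audio and video and
# block content (screen) sharing. Meeting and attendee IDs are placeholders.
meetings.update_attendee_capabilities(
    MeetingId="MEETING_ID",
    AttendeeId="ATTENDEE_ID",
    Capabilities={
        "Audio": "Receive",
        "Video": "Receive",
        "Content": "None",
    },
)
```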
Read More for the details.
AWS – Amazon Kendra releases GitHub OnPrem connectors
Amazon Kendra is an intelligent search service powered by machine learning, enabling organizations to provide relevant information to customers and employees when they need it. Starting today, AWS customers can use the Amazon Kendra GitHub OnPrem connector to index and search documents from a GitHub Enterprise Server data source.
Read More for the details.
AWS – Amazon Kendra releases GitHub SaaS Connector
Amazon Kendra is an intelligent search service powered by machine learning, enabling organizations to provide relevant information to customers and employees when they need it. Starting today, AWS customers can use the Amazon Kendra GitHub SaaS connector to index and search documents from a GitHub Enterprise Cloud data source.
Read More for the details.
Azure – Public preview: Azure Percept DK May (2205) software update
The Azure Percept May (2205) update includes fixes related to security.
Read More for the details.
AWS – SageMaker JumpStart now supports automatic tuning
Amazon SageMaker JumpStart now supports model tuning with SageMaker Automatic Model Tuning for its pre-trained models, pre-built solution templates, and example notebooks. This means customers can automatically tune their machine learning models to find the hyperparameter values with the highest accuracy within the ranges they provide through the SageMaker API.
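Conceptually, this pairs a JumpStart-provided estimator with SageMaker Automatic Model Tuning. The sketch below uses the SageMaker Python SDK; the image and model URIs, execution role, metric regex, and hyperparameter names and ranges are illustrative assumptions to replace with the values retrieved for your chosen JumpStart model:

```python
from sagemaker.estimator import Estimator
from sagemaker.tuner import HyperparameterTuner, ContinuousParameter, IntegerParameter

# Minimal sketch: wrap a JumpStart-provided estimator with Automatic Model Tuning.
# The URIs, role, metric regex, and hyperparameter ranges below are placeholders.
estimator = Estimator(
    image_uri="TRAINING_IMAGE_URI",
    model_uri="PRETRAINED_MODEL_URI",
    role="SAGEMAKER_EXECUTION_ROLE_ARN",
    instance_count=1,
    instance_type="ml.p3.2xlarge",
)

tuner = HyperparameterTuner(
    estimator=estimator,
    objective_metric_name="validation:accuracy",
    metric_definitions=[{"Name": "validation:accuracy",
                         "Regex": "val_accuracy: ([0-9\\.]+)"}],
    hyperparameter_ranges={
        "learning-rate": ContinuousParameter(1e-5, 1e-2),
        "epochs": IntegerParameter(3, 10),
    },
    max_jobs=10,
    max_parallel_jobs=2,
)

tuner.fit({"training": "s3://YOUR_BUCKET/training-data/"})
```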
Read More for the details.
AWS – Amazon Braket adds support for Borealis, the first publicly accessible quantum computer that is claimed to offer quantum advantage
Amazon Braket, the quantum computing service from AWS, adds support for Borealis, a new photonic quantum processing unit (QPU) from Xanadu. The Borealis device is the first publicly available quantum computer claimed, in a peer-reviewed study published in the journal Nature, to have achieved quantum advantage: the technical milestone at which a quantum computer outperforms the world’s fastest supercomputers on a well-defined task. Until now, none of the devices claimed to demonstrate quantum advantage have been accessible to the public; for the first time, customers can test a quantum advantage claim for themselves on Amazon Braket while also exploring potential applications for this technology.
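For orientation, the snippet below shows how a Braket SDK client might look up the new device; the ARN is an assumption, so confirm it in the Braket console before use:

```python
from braket.aws import AwsDevice

# Minimal sketch: look up the Borealis QPU on Amazon Braket and inspect it.
# The ARN below is an assumption; confirm the device ARN in the Braket console.
borealis = AwsDevice("arn:aws:braket:us-east-1::device/qpu/xanadu/Borealis")

print(borealis.name)                # device name
print(borealis.status)              # ONLINE / OFFLINE
print(borealis.properties.service)  # availability windows, shots range, etc.
```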
Read More for the details.
AWS – NoSQL Workbench for Amazon DynamoDB adds support for CreateTable, UpdateTable, and DeleteTable operations
NoSQL Workbench for Amazon DynamoDB is a client-side application to help visualize and build scalable, high-performance data models. Starting today, NoSQL Workbench adds support for table and global secondary index (GSI) control plane operations such as CreateTable, UpdateTable, and DeleteTable.
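These operations map directly to the DynamoDB control plane APIs. For reference, the equivalent boto3 calls look roughly like this (the table name and schema are illustrative):

```python
import boto3

dynamodb = boto3.client("dynamodb")

# Create a table with a partition key and a sort key (on-demand capacity).
dynamodb.create_table(
    TableName="Orders",
    AttributeDefinitions=[
        {"AttributeName": "CustomerId", "AttributeType": "S"},
        {"AttributeName": "OrderId", "AttributeType": "S"},
    ],
    KeySchema=[
        {"AttributeName": "CustomerId", "KeyType": "HASH"},
        {"AttributeName": "OrderId", "KeyType": "RANGE"},
    ],
    BillingMode="PAY_PER_REQUEST",
)

# UpdateTable can later add a global secondary index; DeleteTable removes the table.
dynamodb.delete_table(TableName="Orders")
```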
Read More for the details.
AWS – You can now update the storage and IOPS capacity on your Amazon FSx for OpenZFS file systems
You can now update the storage and IOPS capacity on your Amazon FSx for OpenZFS file systems with the click of a button, making it even easier to adapt to your evolving storage and performance needs.
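The same change can be made programmatically. A rough boto3 sketch of the equivalent UpdateFileSystem call follows; the file system ID and values are placeholders:

```python
import boto3

fsx = boto3.client("fsx")

# Minimal sketch: grow an FSx for OpenZFS file system and provision more IOPS.
# The file system ID and values are placeholders.
fsx.update_file_system(
    FileSystemId="fs-0123456789abcdef0",
    StorageCapacity=2048,  # GiB
    OpenZFSConfiguration={
        "DiskIopsConfiguration": {"Mode": "USER_PROVISIONED", "Iops": 10000},
    },
)
```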
Read More for the details.
GCP – How Google Cloud is helping more startups build, grow, and scale their businesses
Today is our second annual Google Cloud Startup Summit, and we couldn’t be more excited to connect with leaders, founders, and innovators from across the startup ecosystem to learn how they’re innovating and solving some of the world’s biggest challenges.
Startups come to Google Cloud to build their companies and products on our trusted, developer-friendly infrastructure, and to take advantage of our leading capabilities in areas like artificial intelligence (AI), machine learning (ML), and analytics. In January, we expanded our support for early-stage startups to make it even easier for them to get up-and-running on Google Cloud by covering their cloud costs up to $100,000 USD for two years, providing dedicated Startup Success Managers to assist along their journeys, and much more.
We’re excited today to announce several new offers, resources, and programs that will help even more startups build and grow their businesses.
Expanding early-stage financial support to include bootstrapped startups
The first of these new offers is that self-funded startups that apply to our Google for Startups Cloud Program can now receive $2,000 USD in Google Cloud credits to use over two years, helping them fund the development of proofs of concept and showcase their products to investors, talent, and customers.
In addition, bootstrapped startups may also qualify for discounts on Google Domains and Google Workspace, giving them the tools they need to help build their companies and products, and collaborate with their teams.
We’ll update our Google for Startups Cloud Program website in the coming weeks with more information and details about this offer.
Setting founders up for success with new Startup School workshops
We also know that an innovative product or vision can help startups stand out, particularly in the early days of development. That’s why in 2020, Google for Startups created Startup School — a comprehensive training program designed to equip early-stage startups and founders with the tools, products, and knowledge needed to support their growing company. The curriculum provides education on growth, technologies, and products, and covers topics like getting started with Google Cloud, making technical hires, preparing a data pipeline, and setting business objectives and key results (OKRs).
As an extension of this work, this fall we’ll be offering new, live workshops on a monthly basis that will be focused on topics like data analytics and application development, which will help startups build faster with tools and technologies tailored to their needs. These live Startup School sessions will also give founders and teams the opportunity to ask Google experts questions directly and discuss common challenges.
Supporting diverse founders and inclusivity across the globe
We remain committed to supporting a diverse ecosystem of startups, so we’re thrilled to host, in partnership with Inicio Ventures, 30 leading LatinX investors, founders, and ecosystem leaders in Miami next week during the first-ever Google for Startups LatinX Leaders Summit. At the summit, members of the LatinX startup community will have the opportunity to connect with key figures in the space and work together to develop more resources and ways to support startups in the LatinX community so they can overcome barriers and succeed, in the cloud and beyond.
We also recently announced that Google for Startups has teamed up with Visible Hands to run a 20-week fellowship program to support the next wave of early-stage LatinX founders across the United States. In addition to hands-on support from industry and Google experts, the program will provide $10,000 in cash for every participant to help kickstart their ideas. Participants in the program will also have the opportunity to receive additional investments from Visible Hands, up to $150,000. Click here to learn more about this program and apply.
And to help startups prioritize equity and inclusion in their business from day one, we’re proud to have Google DEI and Product Inclusion leads Alexandra Garcia and Steph Boudreau lead a session during the Startup Summit today, outlining how founders can build inclusive products and teams from the start.
Celebrating startups across the ecosystem
No matter the industry, solution, or geography, we’re proud of how startups are using Google Cloud technologies to create impactful customer experiences.
Take web3 startup IoTeX, whose blockchain-based platform handles millions of transactions for machines, devices, and people — with 99.9% platform reliability — by using Google Kubernetes Engine (GKE) to absorb load during surges, so the company can continue growing at scale. Or 3co, which uses Google Cloud technologies like Compute Engine to scale compute power quickly, and TensorFlow to provide 3D technology experiences for global e-commerce companies.
In Singapore, retailTech startup Palexy is using Google Cloud AI/ML products like Vertex AI alongside GKE to quickly build, test, deploy, and manage their applications, which in turn helps Palexy deliver solutions that help retailers use data and AI to sustainably grow their business and improve the in-store shopping experience for customers.
And for France-based social media startup BeReal, creating an impactful customer experience means giving people a platform to become closer to the friends and family they care most about. Using Google Firebase as the foundation to build the prototype, the BeReal platform now uses Cloud Functions for Firebase and GKE, so founder Alexis Barreyat and the BeReal team can build and deploy new platform functionality quickly as the business scales.
The flexibility made possible by the cloud—and greater access than ever before to bleeding-edge capabilities in AI, ML, and analytics—are opening new doors for startups to innovate and grow, and we’re committed to helping them do so on Google Cloud.
We’re grateful to be on this journey with our startup customers, and we look forward to continuing to provide resources and facilitate discussions to better serve the specific needs in this space. Keep an eye out for our “Ask TK” blog series, launching later this June, in which Google Cloud CEO Thomas Kurian and other top thought leaders will address common questions and challenges faced by startups. You can also catch a few top questions, answered by Thomas, in today’s event.
To learn more about the Google for Startups Cloud Program, or to apply to get more support for your startup, please visit our program application page.
Read More for the details.
GCP – How SingleStoreDB uses Google Cloud Marketplace to drive great customer experiences
Founded in 2012, SingleStoreDB is a distributed SQL, cloud-native database that offers ultra-fast, low-latency access to large datasets — simplifying the development of modern enterprise applications. And by combining transactional (OLTP) and analytical (OLAP) workloads, SingleStoreDB introduces new efficiencies into data architecture.
We are a multicloud database, meaning that our customers can launch SingleStoreDB on any cloud infrastructure they choose. At the same time, our alignment with Google Cloud has been foundational to our success.
Let me tell you about how Google Cloud Marketplace has helped us grow our business from a startup in 2011 to one that 12 Fortune 50 companies and half of the top 10 banks in the U.S. rely upon today.
Filling a critical data niche to accelerate business value
First, let’s talk about why companies choose to use SingleStoreDB alongside Google Cloud and other IT solutions.
SingleStoreDB on Google Cloud is ideal for data-intensive applications with these five key requirements:
Query latency — Execute and receive query results with millisecond latencies
Concurrency — Support a large number of users or concurrent queries, without sacrificing latency
Query complexity — Handle both simple and complex queries
Data size — Effortlessly operate over large data sets
Data ingest speed — Load (or ingest) data at very high rates, from thousands to millions of rows per second
Every customer approaches IT differently. Many are taking either a multicloud or a hybrid approach to their infrastructure. By running on Google Cloud, SingleStoreDB helps customers manage workloads across all clouds and on-prem systems.
We find that many of our customers use SingleStoreDB as a single-pane view into all their data and infrastructure, as well as for data-intensive applications. Because SingleStoreDB is an analytical and transactional database, it shines when handling workloads that require outstanding accuracy and speed.
Our larger customers, in particular, increasingly run SingleStoreDB on Google Cloud thanks to Google’s global network backbone. Because Google Cloud owns its global fiber network, we tap into Google’s secure-by-design infrastructure to protect information, identities, applications, and devices. This, alongside the increasing popularity of Google Cloud AI/ML solutions, makes SingleStore on Google Cloud a growing force.
Success on Google Cloud Marketplace
While the partnership between SingleStoreDB and Google Cloud solutions is paramount to our success, the availability of our solution on Google Cloud Marketplace amplifies customer satisfaction.
Google Cloud Marketplace simply makes it easier for partners to sell into Google Cloud customers. Here are some of the advantages we’ve seen since launching on Google Cloud Marketplace:
Our customers can easily transact and deploy via Marketplace, providing that true Software-as-a-Service feel for SingleStoreDB.
Transacting through Marketplace counts toward our customers’ Google Cloud commit, which helps partners like us tap into budget that customers are already setting aside for Google Cloud spend.
Billing and invoicing are intuitive and efficient, as everything is automated through Google Cloud Marketplace. Our customers receive a single, clear invoice from Google covering SingleStoreDB, Google Cloud, and other partner solutions purchased from Marketplace.
Many of our customers have previously signed the Google Cloud User License Agreement (ULA), which further expedites procurement by speeding up legal reviews.
Transforming how we sell
All of these advantages translate to a powerful change in our sales strategy. We no longer have to get bogged down by numbers and contracts. Instead, we can focus on educating customers about how the use of our solutions on Google Cloud will deliver returns to their business. Co-selling with Google Cloud has benefited our business significantly and provides customers with the best possible experiences.
We have also seen surprising advantages stem from our success on Google Cloud Marketplace. Our product and engineering teams are much more engaged with Google Cloud technologies now, putting a heavy emphasis on new integrations. We’re actively exploring new Dataflow templates, native connectors for BigQuery and Data Studio, and other solutions.
What SingleStoreDB customers are saying
To give you a better idea of how our customers are benefiting, here are some recent comments about running SingleStoreDB on Google Cloud and transacting through Google Cloud Marketplace:
“Our goal has and will always be to build our platform in a way that makes it feel like an on-premise solution. Speed of data processing and delivering are critical to providing this within our solutions. Both Google Cloud and SingleStore have helped us achieve this.” — Benjamin Rowe, Cloud & Security Architect, Arcules
“The SingleStore solution on Google Cloud allows marketing teams using Factors.ai to make more informed data decisions by organizing data in one system and allowing for self-serve analytics.” — Praveen Das, Co-Founder, Factors.ai
At the end of the day, our customer experiences and the business impacts they achieve with our solutions are our most critical KPIs. By partnering with Google Cloud, we have unlimited potential to improve our services to power the next generation of data and analytics applications.
Discover how SingleStore can transform your business on the Google Cloud Marketplace.
Read More for the details.
GCP – 5 ways retailers can evolve beyond traditional segmentation methods
Marketers have long devised ways to use data to predict behavior and drive personalization. Popular tactics include leveraging customer segmentation and transactional purchase data to recommend products and experiences based on attributes such as “people similar to you liked this” or “last viewed products” data. But is this the right way to go about improving and personalizing the customer experience? Can retailers do better?
Today, consumers are interacting and engaging with brands in many different ways, especially as the dynamics of ecommerce have shifted over the past two years. For many retailers that are managing disruption in the current environment, every day is like Black Friday, with new customers they know nothing about and current customers who are shopping differently across and within categories. Retailers need to spend wisely for new customer acquisition and, since acquisition is often more costly than retention, they need to incorporate intelligent methods to build lifetime loyalty. Retailers should look ahead to the next generation of the digital experience, try to understand how the customer journey has changed, and move past mere personalization to build more empathy into the process.
That was the focus of a recent eMarketer Tech-Talk featuring the visionary founders of three technology companies, all of whom are Google Cloud partners that are focused on helping retailers activate and harness their data better: Fayez Mohamood, co-founder and CEO of Bluecore; James McDermott, co-founder and CEO of Lytics; Mario Ciabarra, founder and CEO of Quantum Metric. Our colleague Carrie Tharp, VP of Retail and Consumer at Google Cloud, moderated the conversation. Here are five key takeaways.
1. Think beyond customer segmentation and demographics. Topping the list is the drive toward using data and technology to change the customer experience as it’s happening. “Customers don’t go on linear journeys; they live in micro moments,” Mohamood from Bluecore commented. They interact with the brand across multiple touch points. Retailers need to understand that and engage customers in the moment through personalization and recommendations. That is a key area that Bluecore, an AI-driven retail marketing platform, is helping retailers focus on: activating product catalog and customer data across digital channels to drive repeat purchases and grow revenue. “It’s time to think past the classic focus on customer segmentation and demographics,” Tharp added. The trick is to understand people as individuals whose purpose in their different interactions with a brand varies wildly.
2. Democratize data so that those who are designing the customer experiences can use it in real time. With the advent of the cloud-based customer data platform (CDP), retailers can more quickly build and act on customer profiles using different types of intelligence relative to the customer’s intent and interests. “That’s the starting point that enables us to listen and understand their real-time behavior,” McDermott from Lytics advised. “But building profiles is not the end goal; how will you act on that data? How are you making the experience better for your customers?” McDermott co-founded Lytics, which focuses on behavioral and intent-based analytics, after discovering a need in the market to bridge the gap between data and action. He suggested beginning with the end in mind, with a use case to illustrate.
Doing that isn’t simple. Deciding which data has value and how to use it is the crux of the challenge. The good news is that the CDP has made more data accessible to data scientists, who are using it to build models that can be turned into systems of engagement. Internal teams are experimenting and testing continuously, with a rich set of hypotheses to figure out how to prioritize the signals and determine which experiences will actually drive better interactions. However, as Mohamood observed, “That’s only one piece of the puzzle, because data scientists are typically not actually creating the experiences that customers have on channels – whether it’s websites, email, SMS, and so forth. We need to make data democratic to marketers and business users who need it in real time to test and learn.”
3. Leverage artificial intelligence (AI) to anticipate behavioral and inventory changes. Rich product data and attribution are as important as customer data, Mohamood remarked. “Shoppers change, and then the data changes, and then inventory changes, and the data changes yet again. Your team has to be nimble to react to those signals.” That’s where AI comes in – augmented with human insight and intervention.
Some retailers have a constantly changing inventory mix, and can match what’s in stock to the customer intent. They adjust their messages to the customer accordingly, without changing the frequency of communications, thereby improving repeat purchase rates. Ciabarra from Quantum Metric also talked about the importance of AI to react to changes faster. “I know it sounds crazy, but real time really means real time. To allow me to go and ask questions of the data that are really complex and be able to come out with answers and build that analysis into the product, I think we’re going to see that unlock of AI and ML.” Quantum Metric’s focus with continuous product design is all about building analysis into and across the digital product design process so that teams can iterate faster and deliver the products that customers truly want.
In addition, Tharp pointed out that many retailers are adding marketplace capabilities to their sites, evolving from static inventory levels to “an expansive, endless aisle.” However, they are struggling to get the right operating model in place across marketing, analytic, and digital teams. How can retailers think about their operating model related to personalization and data?
4. Build the link between customer data and empathy. In short, the operating model has to support the full connection of all relevant interactions – for example, combining past purchase data with intent data, browsing data, and mobile app data. That requires the ability to integrate multiple data sets and operationalize higher-performing personalization. The technology stack is being adapted to be oriented around the customer, McDermott said, with the architecture becoming simpler and more centralized in the cloud data warehouse. One clear trend is that the CDP has become the central repository for customer data, with more intelligence built in using AI and machine learning.
A key goal is to create a more “one-to-one” connection with customers by enabling retailers to gain real-time visibility into the customer’s selection process and how they are doing in the checkout. “We have to have that level of empathy. I like to use the words ‘quantified empathy’,” Ciabarra said. “In a store we had that one-on-one connection, but online, it could be one marketer to a million customers. How do you get real-time visibility across the organization so they can consume and act on it? Everyone has to have a shared view.”
5. Transform the organization and break down silos. Just about every company today is focused on becoming more customer-centric. But the great majority are structured around specialized teams that function in silos with their own perspectives and maybe even different objectives. “Data tends to be held hostage by a few folks in the organization,” Mohamood noted. Enabling a shared view across the organization among marketers, merchandising, and digital teams requires a gradual cultural shift – in the best case, with executive buy-in.
Some companies promote collaboration by assembling pods that include marketers, data scientists, analysts, and creative people testing and learning together. Another promising development: marketers and data scientists are proactively partnering with finance and operations teams, especially around KPI design. They are working together to design metrics that are customer focused, rather than channel-focused, and reexamining the technology stack from that point of view. The mindset is evolving toward looking at data not from the perspective of marketing or product or UX and design, but through the customer lens, pulling the data together in a way that’s more usable.
We invite you to watch the full webinar to derive even more insights from this thoughtful discussion. In addition, learn more about Bluecore, Lytics, and Quantum Metric solutions, all available on the Google Cloud Marketplace.
Read More for the details.
GCP – Understanding Google Cloud’s VMware Engine Migration Process and Performance
Google Cloud VMware Engine (GCVE) lets you deploy a managed VMware environment within Google Cloud. We’ve put together a new white paper, “Google Cloud VMware Engine Performance Migration & Benchmarks,” to help our customers better understand the architecture, its performance, and its benefits. If you’re not familiar with Google Cloud VMware Engine yet, let’s talk a bit more about it.
Utilizing Google Cloud lets you access existing services and cloud capabilities; one of the solutions discussed in the white paper is VMware Hybrid Cloud Extension, also known as HCX. HCX eases the transition from on-prem to the cloud, allowing systems administrators to quickly deploy a private cloud and scale the virtual machines they need accordingly. The referenced solution is well suited for organizations looking to begin their cloud migration journey and understand the technical requirements of the process without having to be fully committed to a cloud or data center evacuation strategy.
Many organizations are still navigating their way through IT challenges and cloud decisions. Google Cloud VMware Engine provides an “easy on-ramp” for migrating your workloads into the cloud. You don’t have to move everything at once, though, because GCVE gives you the option to scale your IT infrastructure from on-prem to the cloud at your discretion by leveraging HCX.
HCX also lets you migrate a virtual machine from on-premises to the cloud over a VPN or internet connection without additional downtime, and without users having to save their work and log off their machines. With GCVE, your teams can keep working during business hours while your systems administrators migrate them to the cloud, avoiding the downtime traditionally associated with virtual machine migration.
The ability to migrate a virtual machine from on-prem to the cloud raises another question: how fast can a targeted virtual machine migrate to the cloud? Google analyzed this specific scenario, assessing what the requirements were to migrate an on-prem virtual machine to the cloud via a Virtual Private Network (VPN), and then analyzing how fast that connection was established and transmitted through HCX.
The answer to that question—and more—is all contained within our brand new white paper, “Google Cloud VMware Engine Performance Migration & Benchmarks,“ which you can download now. And if you’re ready to get started with your migration efforts, sign up for a free discovery and assessment with our migration experts.
Read More for the details.
GCP – Google Distributed Cloud adds AI, ML and Database Solutions to deliver customers even greater flexibility and choice
Organizations need a cloud platform that can securely scale from on-premises, to edge, to cloud while remaining open to change, choice, and customization. They must be able to run their applications wherever they need, on infrastructure optimized to process a very high volume of data with minimal delay, all while maintaining the satisfaction and stability of their ML-driven user experiences. At Google, we deeply understand these customer requirements, which is why we launched Google Distributed Cloud last year. With Google Distributed Cloud, we bring a portfolio of fully managed solutions to extend our infrastructure to the edge and into customers’ own data centers.
Today, Google Cloud customers love our artificial intelligence (AI) services for building, deploying and scaling more effective AI models with developers successfully using our core machine learning (ML) services to build and train high-quality custom ML models. Our customers are also using a variety of our managed database solutions because of their simplicity and reliability. However, some customers have highly sensitive workloads and desire to use their own private, dedicated facilities.
For these customers, we’re excited to announce that they will be able to run a selection of these same AI, ML, and database services in Google Distributed Cloud Hosted inside their own data centers within the next year. With this announcement, customers can take advantage of Anthos, a common management control plane that provides a consistent development and operations experience across hybrid environments. This same experience is now available for on-premises environments.
Our portfolio of AI, ML, and database products enables customers to quickly deploy services with out-of-the-box simplicity, including delivering valuable insights from both unstructured and structured data. The integration of our Google Cloud AI and database solutions into the Google Distributed Cloud portfolio means customers can harness real-time data insights like never before, thanks to proximity to where the data is being generated and consumed. This includes ensuring low latency to support mission-critical applications such as computer vision, which can be used on the factory floor to detect flaws in products or to index large amounts of video. The addition of these transformative capabilities allows customers to save money, innovate faster, and gain greater flexibility and choice.
With this integration, customers using Google Distributed Cloud Hosted will have access to some exciting AI features. One example is our Translation API, which can instantly translate text in more than one hundred languages. Translation API is a feature of Vertex AI, our generally available managed ML platform that allows companies to accelerate the deployment and maintenance of AI models. With this announcement, customers who need to run highly sensitive workloads in an on-premises or edge environment can leverage the unique functionality of Translation API along with other Google Cloud pre-trained APIs in Vertex AI, such as Speech-to-Text and optical character recognition (OCR). These features were all trained on our planet-scale infrastructure to deliver the highest level of performance, and as always, all of our new AI products adhere to our AI Principles.
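To make the Translation API example concrete, here is a minimal sketch using the public Cloud Translation client library; the project ID is a placeholder, and endpoints and availability inside Google Distributed Cloud Hosted may differ from the public API shown here:

```python
from google.cloud import translate_v3 as translate

# Minimal sketch of the Cloud Translation API that is coming to Google
# Distributed Cloud Hosted. PROJECT_ID is a placeholder.
client = translate.TranslationServiceClient()
parent = "projects/PROJECT_ID/locations/global"

response = client.translate_text(
    parent=parent,
    contents=["The factory floor camera detected a surface defect."],
    source_language_code="en",
    target_language_code="de",
)
for translation in response.translations:
    print(translation.translated_text)
```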
Additionally, by incorporating our managed database offerings into the Google Distributed Cloud portfolio, customers can process data locally to migrate or modernize applications, freeing up more time to innovate and create value in their applications. This is especially true in industries like financial services and healthcare, where there are compliance requirements on where data can reside.
With these new AI, ML, and database products available in the Google Distributed Cloud portfolio, customers retain full autonomy and control over their own data centers while relying on Google for the latest technology innovations in cloud services.
For more information, please visit Google Distributed Cloud, and to learn more about Vertex AI specifically, join us at our Applied ML Summit.
Read More for the details.
GCP – Cloud CISO Perspectives: May 2022
May was another big month for us, even as we get ready for more industry work and engagement at the RSA Security Conference in San Francisco. At our Security Summit and throughout the past month, we continued to launch new security products and features, and increased service and support for all our Google Cloud and Google Workspace customers.
Google Cloud’s Security Summit 2022
Our second annual Security Summit held on May 17 was a great success. In the days leading up to the Summit, we discussed how we are working to bring Zero Trust policies to government agencies, and we revealed our partnership with AMD to further advance Confidential Computing – including an in-depth review focused on the implementation of the AMD secure processor in the third generation AMD EPYC processor family.
We also introduced the latest advancements in our portfolio of security solutions. These include our new Assured Open Source Software service (Assured OSS), which enables enterprise and public sector users of open source software to incorporate into their own developer workflows the same OSS packages that Google uses; extending Autonomic Security Operations (ASO) to the U.S. public sector, a solution framework to modernize cybersecurity analytics and threat management that’s aligned with the Zero Trust and supply-chain security objectives of 2021’s cybersecurity Executive Order and the Office of Management and Budget memorandum; expanding our compliance with government software standards; and SAML support for Workload Identity Federation, so that customers can use a SAML-based identity provider to reduce their use of long-lived service account keys.
Advancing open source software security
We continued to partner with the Open Source Security Foundation (OpenSSF), the Linux Foundation, and other organizations at another industry open source security summit to further develop the initiatives discussed during January’s White House Summit on Open Source Security. We’re working towards the goal of making sure that every open source developer has effortless access to end-to-end security by default.
As covered in our Security Summit, an important part of this effort is Assured OSS, which leverages Google’s extensive security experience and can help organizations reduce their need to develop, maintain, and operate complex processes to secure their open source dependencies. Assured OSS is expected to enter Preview in Q3 2022.
Also, as part of our commitment to improving software supply-chain security, the Open Source Insights project helps developers better understand the structure and security of the software they use. We introduced Open Source Insights data in BigQuery in May so that anyone can use Google Cloud BigQuery to explore and analyze the dependencies, advisories, ownership, license and other metadata of open-source packages across supported ecosystems, and how this metadata has changed over time.
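As a quick illustration, the sketch below queries the public dataset with the BigQuery Python client; the dataset, table, and column names are assumptions to confirm against the published schema:

```python
from google.cloud import bigquery

# Minimal sketch: query the Open Source Insights (deps.dev) public dataset.
# The dataset and column names below are assumptions; browse
# bigquery-public-data.deps_dev_v1 in the console to confirm the schema.
client = bigquery.Client()

query = """
    SELECT Name, Version
    FROM `bigquery-public-data.deps_dev_v1.PackageVersions`
    WHERE System = 'NPM' AND Name = 'react'
    ORDER BY Version DESC
    LIMIT 10
"""
for row in client.query(query).result():
    print(row.Name, row.Version)
```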
Why Confidential Computing and our partnership with AMD matters
I’d like to take a moment to share a bit more on the importance of Confidential Computing and our partnership with AMD. I’ve been talking a lot this year about why we as an industry need to evolve our understanding of shared responsibility into shared fate. The former assigns responsibilities to either the cloud provider or the cloud provider’s customer, but shared fate is a more resilient cybersecurity mindset.
It’s a closer partnership between cloud provider and customer that emphasizes secure-by-default configurations, secure blueprints and policy hierarchies, consistently available advanced security features, high assurance attestation of controls, and insurance partnerships.
In our collaboration with AMD, we focused on how secure isolation has always been critical to our cloud infrastructure, and how Confidential Computing cryptographically reinforces that secure isolation. AMD’s firmware and product security teams, Google Project Zero, and the Google Cloud Security team collaborated for several months to analyze the technologies and firmware that AMD contributes to Google Cloud’s Confidential Computing services.
Also in May, we expanded the availability of Confidential Computing to include N2D and C2D Virtual Machines, which run on third-generation AMD EPYC™ processors.
GCAT Highlights
Here are the latest updates, products, services and resources from our cloud security teams this month:
Security
PSP protocol now open source: In order to better scale the security we offer our customers, we created a new cryptographic offload protocol for internal use that we open sourced in May. Intentionally designed to meet the requirements of large-scale data-center traffic, the PSP Protocol is a TLS-like protocol that is transport-independent, enables per-connection security, and is offload-friendly.
Updating Siemplify SOAR: The future of security teams is heading towards “anywhere operations,” and the latest version of Siemplify SOAR can help get us there. It gives organizations the building blocks needed across cloud infrastructure, automation, collaboration, and analytics to accelerate processes for more timely responses and automated workflows. In turn, this can free up teams to focus on more strategic work.
Guardrails and governance for Terraform: Popular open-source Infrastructure-as-Code tool Terraform can increase agility and reduce errors by automating the deployment of infrastructure and services that are used together to deliver applications. Our new tool verifies Terraform and can help reduce misconfigurations of Google Cloud resources that violate any of your organization’s policies.
Benchmarking Container-Optimized OS: As part of our security-first approach to safeguarding customer data while also making it more scalable, we want to make sure that our Container-Optimized OS is in line with industry-standard best practices. To this end, the Google Cloud Security team has released a new CIS benchmark that clarifies and codifies the security measures we have been using, and makes recommendations for hardening.
New reCAPTCHA Enterprise guidebook: Identifying when a fraudster is on the other end of the computer is a complex endeavor. Our new reCAPTCHA Enterprise guidebook helps organizations identify a broad range of online fraud and strengthen their website security.
Take the State of DevOps 2022 survey: The State of DevOps report by Google Cloud and the DORA research team is the largest and longest running research of its kind, with inputs from more than 32,000 professionals worldwide. This year will focus on how security practices and capabilities predict overall software delivery and operations performance, so be sure to share your thoughts with us.
Industry updates
Security improvements to Google Workspace: I wrote at the beginning of the year that data sovereignty is one of the major, driving megatrends shaping our industry today. At the beginning of May we announced Sovereign Controls for Google Workspace, which can provide digital sovereignty capabilities for organizations, both in the public and private sector, to control, limit, and monitor transfers of data to and from the EU starting at the end of 2022, with additional capabilities delivered throughout 2023. This commitment builds on our existing Client-side encryption, Data regions, and Access Controls capabilities.
We are also extending Chrome’s Security Insights to Google Cloud and Google Workspace products, as part of our efforts to consistently provide advanced features to our customers.
Can you hear the security now? Pindrop is joining forces with Google Cloud. If you’ve never heard of Pindrop, you’ve almost certainly encountered their technology, which is used to authenticate payments, place restaurant and shopping orders, and check financial accounts over the phone. Their technology also provides the backbone for anti-fraud efforts in voice-based controls. With Google Cloud, Pindrop will be better able to detect deepfakes and robocalls, help banks authenticate transactions, and provide retailers with secure AI-powered call center support.
Compliance & Controls
Expanding public sector and government compliance: Google Cloud is committed to providing government agencies with the security capabilities they need to achieve their missions. In addition to our aforementioned Autonomic Security Operations and new Assured Open Source Software (OSS) service, we’re expanding Assured Workloads. This can help enable regulated workloads to run securely at scale in Google Cloud’s infrastructure. We are also pleased to announce that 14 new Google Cloud services support FedRAMP Moderate and three services are being added to support FedRAMP High, with more coming this summer. (You can read the full list of those services at the end of this blog.)
Next month we’ll recap highlights from the RSA Conference and much more.
To have our Cloud CISO Perspectives post delivered every month to your inbox, sign up for our newsletter. We’ll be back next month with more security-related updates.
Read More for the details.
Azure – General availability: Storage optimized Azure VMs deliver higher performance for data analytics.
The new Lasv3 and Lsv3 VM series are well suited for high-throughput and high-IOPS workloads, including big data applications, SQL and NoSQL databases, data-intensive applications, and more.
Read More for the details.
