Azure – Generally available updates for early March 2023
General availability enhancements and updates released for Azure SQL in early March 2023
Read More for the details.
Amazon DocumentDB (with MongoDB compatibility) continues to increase compatibility with MongoDB, and now offers added support for MongoDB 5.0 drivers with Amazon DocumentDB 5.0. Amazon DocumentDB is a fast, scalable, highly available, and fully managed document database service that supports MongoDB API-based workloads. Amazon DocumentDB makes it easy and intuitive to store, index, and query JSON data.
Read More for the details.
We like hearing feedback, and you’re not shy about giving it to us. Excellent! Each day, you reach us through our Twitter account, Innovator’s Program forums, and even when we meet in person at industry events. Your feedback helps us better understand what you need, what’s not working right, and how we can make your cloud experience better.
Another way we hear from our customers? Independent research. Third-party research allows us to get a better understanding of how our products stack up against the competition. We just finished some new research that I want to tell you about. What did we find? Google Cloud is making it easier for developers everywhere to do what they love, faster.
We partnered with User Research International to conduct research with the goal of quantifying how fast you can do things in Google Cloud compared to our major competitors in the cloud industry. We ran 45 two-hour sessions to gather more information on how customers use products like GKE Autopilot, Cloud Run jobs, and Cloud Deploy compared with the equivalent services offered by AWS and Azure. The 45 participants were nearly evenly distributed among those who identified themselves as Google Cloud, AWS, or Azure experts, meaning that was the main cloud provider on which they perform daily tasks. Understanding how developers who are not native to our platform perform was crucial to getting a real-world view of what customers need and how well cloud products address those needs.
Developers today get pulled into activities like operations, security, and infrastructure management, when all they want to focus on is coding. These distractions reduce developer productivity and force them to learn yet another skill. That is why we chose to focus on three of our products that abstract away much of the complexity of cloud development.
With tools like GKE Autopilot, you no longer have to be an expert in Kubernetes or containers to be successful. Autopilot mode is a hands-off, fully managed service that manages your entire cluster's infrastructure, so you don't have to worry about configuration and monitoring, while still delivering a complete Kubernetes experience. With GKE Autopilot, deploying a containerized application on Google Cloud is up to 2.6x faster than competing managed Kubernetes offerings.1
The Kubernetes learning curve can be steep, but GKE Autopilot makes it easier. To learn more about containers and Kubernetes, here are two of my favorite tutorials that can help kick off the journey to deploy your first GKE application:
Deploy a containerized web application. This is a full-fledged tutorial you can walk through.
Interactive tutorial: GKE Autopilot. Here’s one of dozens of in-console tutorials that walk you through GKE, step by step.
While many of us like using Kubernetes to run containerized workloads, serverless options offer another outstanding host for our apps. You can go from source to deploy in one command with Cloud Run — and that’s just one of the tools that speed up your development.
Unlike Cloud Run services, which run continuously to respond to web requests or events, Cloud Run jobs run code that performs some work and quits when the work is done. In fact, with tools like Cloud Run jobs, developers deploy and execute serverless jobs 50% faster than the nearest competition.1 Cloud Run jobs are a good fit for administrative tasks such as database migrations, scheduled work like nightly reports, or batch data transformation.
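To make that distinction concrete, here's a minimal sketch of the kind of script you might package into a container and run as a Cloud Run job. The CLOUD_RUN_TASK_INDEX and CLOUD_RUN_TASK_COUNT environment variables are set by Cloud Run jobs on every task; the sharding logic and input data here are purely illustrative.

```python
import os

# Cloud Run jobs set these environment variables on every task; a job
# with N tasks runs indices 0..N-1 and finishes when each task exits.
TASK_INDEX = int(os.environ.get("CLOUD_RUN_TASK_INDEX", "0"))
TASK_COUNT = int(os.environ.get("CLOUD_RUN_TASK_COUNT", "1"))


def main() -> None:
    # Hypothetical workload: one shard of a nightly batch transformation.
    records = range(1_000)  # stand-in for real input data
    shard = [r for r in records if r % TASK_COUNT == TASK_INDEX]
    print(f"task {TASK_INDEX + 1}/{TASK_COUNT}: processing {len(shard)} records")
    # ...transform and write results here...


if __name__ == "__main__":
    main()  # the process exits when done, which ends the task
```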
Looking to get started with Cloud Run jobs today? Check out this simple tutorial the team built that walks you through creating and executing a job.
Cloud Run jobs automate administrative tasks, while Cloud Deploy automates DevOps-style continuous delivery tasks like release promotion and approval; our research showed that these tasks are 2.4x faster on Google Cloud than with competing cloud services.1 Cloud Deploy is a managed, opinionated continuous delivery service that makes continuous delivery to GKE easier, faster, and more reliable.
Whether you’re using GKE or Cloud Run, Cloud Deploy simplifies the continuous delivery journey by providing a fully managed service that includes easy one-step promotion and rollback of releases via the web console, CLI, or API. Get started today with Cloud Deploy by checking out these tutorials that I’ve found helpful:
Deploy an app to GKE using Google Cloud Deploy. This complete walkthrough helps you deploy an app to GKE.
Interactive Tutorial: Cloud Deploy End-to-End. This guided experience helps you complete every step.
Our goal is to make it easier to build, deploy, and run apps that matter. Our independent research helped show that many of our cloud services help you deliver value faster, while giving you more time to focus on the fun parts of app development.
Never hesitate to give us feedback on what you need to be even more inspired and successful with the cloud, and don't forget to join our Innovator's Program to stay up to date on all the latest events and connect with other learners!
1. Google Developer Experience – Competitive Benchmark Report 2022 by User Research International
Read More for the details.
Retail companies put geospatial data to use to solve all manner of challenges. Data that is tied to location is vital to understanding customers and solving business problems. Say you’re a Director of Leasing who needs to choose the location for a new store. You’ll need to know your potential customers’ demographics, the products being sold by competitors, foot-traffic patterns — but all that data would be essentially useless if it wasn’t tied to a spatial location.
Adding the spatial dimension to your data unlocks new potential, but also adds a degree of complexity. Geospatial data requires map-based visualizations, unique functions and procedures, and far more storage space than your average data. This is why Location Intelligence platforms like CARTO, and petabyte-scale data warehouses like BigQuery, are an essential part of a business that wants to use geospatial data to solve its business problems.
CARTO for Retail is a set of platform components and analytical functions developed to assist retail companies with their geospatial data. The CARTO for Retail functions are deployed directly in BigQuery, and combine spatial analytics with BigQuery Machine Learning tools to run predictions and analysis in the same location as your data. The CARTO for Retail Reference Guide goes into extensive detail about this solution, which we'll dive into below.
The CARTO platform provides a set of capabilities that, when combined with the processing power of BigQuery, form the CARTO for Retail Solution. Below is an illustration of the components in play:
Visualization
The CARTO Builder is a web-based drag-and-drop analysis tool that allows you to quickly see your geospatial data on a map. You can do discovery analyses with the built-in spatial functions, which push the processing down to BigQuery without any configuration required on your end. If you do want to put your hands on the commands that are sent to BigQuery, you can open the SQL interface and edit the code directly. This makes the CARTO Builder an excellent tool for rapidly prototyping geospatial applications.
Once you're ready to add advanced application features like fine-grained access controls or custom filtering, you can extend your prototype's code using the deck.gl framework (the framework CARTO itself is built on) and CARTO for React. They also provide some helpful templates at that link to get you started.
Data
While most companies generate some of their own geospatial data, very few have the means (think hundreds of thousands of drones in the sky on a daily basis) to generate the full picture. How about adding some location data to your location data? CARTO's Data Observatory provides curated third-party data (some free, most for purchase), including socio-demographics, points of interest, foot and road traffic, behavioral data, and credit card transactions. All the data is already hosted in BigQuery, so it's easy to merge with your existing data. BigQuery itself also has a number of publicly available geospatial datasets, including OpenStreetMap.
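As a quick illustration, here's a sketch of querying the public OpenStreetMap dataset straight from the BigQuery Python client. The dataset, table, and geometry column names follow the public bigquery-public-data.geo_openstreetmap listing as currently published; verify them in your console before running.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses your default project and credentials

# Count OpenStreetMap features within ~1 km of downtown Austin. The
# table and geometry column follow the public geo_openstreetmap
# dataset listing; confirm the schema before relying on this.
sql = """
SELECT COUNT(*) AS feature_count
FROM `bigquery-public-data.geo_openstreetmap.planet_features`
WHERE ST_DWITHIN(geometry, ST_GEOGPOINT(-97.7431, 30.2672), 1000)
"""
row = list(client.query(sql).result())[0]
print(f"{row.feature_count} features within 1 km")
```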
Analytics
The CARTO Analytics Toolbox includes a series of retail-specific user-defined functions and stored procedures. These procedures, part of CARTO's Analytics Toolbox for BigQuery, can be accessed through the CARTO platform or directly in the BigQuery console. Leveraging the massive computing power of BigQuery, you can run the following analyses:
Clustering
Analysis to identify optimal store locations by geographically clustering customers, competitors, and existing stores.
Commercial Hotspots
Models to focus on the most effective areas for expansion, based on the surrounding retail fabric.
Whitespace Analysis
Routines to identify the best potential locations, where expected revenue is higher than that of top-performing stores and key business criteria are met.
Twin Areas Analysis
ML-driven analytics to focus network expansion strategies on the locations most similar to the best-performing stores.
Store Revenue Prediction
A trained Machine Learning model to predict the annual revenue of a planned store location.
Store Cannibalization
A model to estimate the overlap of areas and spatial features that a new store would have with the existing store network.
Now let’s see the CARTO for Retail components in action. Our goal for this example is to identify similar areas (known as twin areas) in Texas to match a particular high-performing store in Austin. We first create a connection to BigQuery using a service account.
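Here's a minimal sketch of that connection using the BigQuery Python client and a service-account key file (the key path is illustrative; CARTO asks for the same kind of key when you register a BigQuery connection in its platform):

```python
from google.cloud import bigquery

# Authenticate with a service-account key file; the path is illustrative.
client = bigquery.Client.from_service_account_json("retail-sa-key.json")
print(client.project)  # quick sanity check that the connection works
```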
Next, we need to create our data model using the carto.BUILD_REVENUE_MODEL_DATA function. This function takes in stores, revenue data, and competitors, along with the trade areas (which can be any polygon, such as a radius, drive time, or custom-drawn polygon) and the desired enrichment variables, and creates the evaluation grid used to find twin areas. Below is an example of this function:
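Treat the following as an illustrative sketch only: the `carto-un` project path, the tables, and the argument list below paraphrase the inputs described above rather than the documented signature, so check CARTO's reference for the exact parameters before running.

```python
from google.cloud import bigquery

client = bigquery.Client.from_service_account_json("retail-sa-key.json")

# Illustrative CALL only: arguments paraphrase the inputs described
# above (stores + revenue, competitors, grid, trade area, enrichment
# variables); names, order, and types are not the documented signature.
sql = r"""
CALL `carto-un`.carto.BUILD_REVENUE_MODEL_DATA(
  'SELECT store_id, revenue, geom FROM `my-project.retail.stores`',  -- stores + revenue (hypothetical table)
  'SELECT geom FROM `my-project.retail.competitors`',                -- competitor locations (hypothetical table)
  'h3', 8,                           -- evaluation grid type and resolution
  'buffer 1000',                     -- trade area: a 1 km radius per store
  ['population', 'income'],          -- enrichment variables to pull in
  '`my-project.retail.model_data`'   -- output destination
)
"""
client.query(sql).result()
```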
Next, we need to build out the revenue model using carto.BUILD_REVENUE_MODEL. This uses BigQuery ML to perform the model predictions and supports LINEAR_REG and BOOSTED_TREE_REGRESSOR. Check the model documentation for more information.
This will output the model, SHAP values, and model statistics to understand the model performance. Below is an example query to run this:
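Again a paraphrased sketch rather than the documented signature: the procedure trains a BigQuery ML model on the model data built above and writes out the model, SHAP values, and performance statistics. Verify the argument list against the model documentation before running.

```python
from google.cloud import bigquery

client = bigquery.Client.from_service_account_json("retail-sa-key.json")

# Illustrative CALL: trains a BigQuery ML regression model on the
# model data from the previous step (argument list is paraphrased).
sql = r"""
CALL `carto-un`.carto.BUILD_REVENUE_MODEL(
  '`my-project.retail.model_data`',    -- output of BUILD_REVENUE_MODEL_DATA
  'BOOSTED_TREE_REGRESSOR',            -- or LINEAR_REG
  '`my-project.retail.revenue_model`'  -- destination for model and stats
)
"""
client.query(sql).result()
```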
Finally, we can predict our twin areas. We pick a target cell index, which we can identify from our map; this cell contains the top-performing store we want to find similar areas to.
From here we can run our Twin Areas model, which is based on Principal Component Analysis (PCA). We provide a query containing our target H3 cell, a second query selecting the cells we want to study (any cell without a store in Texas), and several other arguments to fine-tune our results:
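CARTO's toolbox exposes this as a stored procedure (FIND_TWIN_AREAS in current docs); the sketch below paraphrases its arguments, and the H3 index and table names are hypothetical, so confirm against CARTO's reference before running.

```python
from google.cloud import bigquery

client = bigquery.Client.from_service_account_json("retail-sa-key.json")

# Sketch of the twin-areas call; procedure arguments are our paraphrase
# and the H3 index and tables are hypothetical.
sql = r"""
CALL `carto-un`.carto.FIND_TWIN_AREAS(
  -- the H3 cell containing our top-performing Austin store
  'SELECT h3 FROM `my-project.retail.model_data` WHERE h3 = "88489e25a3fffff"',
  -- candidate cells: anywhere in Texas without an existing store
  'SELECT h3 FROM `my-project.retail.model_data` WHERE store_count = 0',
  0.90,                             -- variance the PCA components should retain
  50,                               -- maximum number of twin areas to return
  '`my-project.retail.twin_areas`'  -- output table for the ranked results
)
"""
client.query(sql).result()
```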
The result is an interactive map showing the top areas likely to perform most similarly to our target store based on our geospatial factors. We can also include other store attributes in our first step to add site-specific details like square footage or year built.
Believe it or not, there are even more tools and functions that can help you make the most of your geospatial data, which are explored in the CARTO for Retail Reference Guide. There's the BigQuery Tiler, and CARTO's out-of-the-box Site Selection application, which includes relevant third-party data, advanced map visualizations, and embedded models to pinpoint the best locations for network expansion.
In addition to the CARTO Analytics Toolbox, BigQuery also has many additional GIS functions for analyzing your geospatial data. Check out this blog on optimizing your spatial storage clustering in BigQuery. If you’re looking to analyze raster data, or can’t find the dataset you need in CARTO’s data observatory, consider trying Google Earth Engine.
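For example, here's a sketch of a plain BigQuery GIS query, no CARTO required, that finds stores within five kilometers of a candidate site. The stores table is hypothetical; ST_GEOGPOINT, ST_DWITHIN, and ST_DISTANCE are built-in BigQuery geography functions.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Stores within 5 km of a candidate site, with straight-line distances.
# The retail.stores table is hypothetical.
sql = """
SELECT store_id,
       ST_DISTANCE(geom, ST_GEOGPOINT(-97.7431, 30.2672)) AS meters_away
FROM `my-project.retail.stores`
WHERE ST_DWITHIN(geom, ST_GEOGPOINT(-97.7431, 30.2672), 5000)
ORDER BY meters_away
"""
for row in client.query(sql).result():
    print(row.store_id, round(row.meters_away))
```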
Read More for the details.
Communications service providers (CSPs) have never seen greater demand for their services from consumers and businesses around the world. But to meet that demand, as well as drive new growth and differentiation in the midst of industry disruption, CSPs must themselves transform: restructuring the cost base of their technology estate, unlocking the full power of their data, innovating the customer experience, and reimagining how they build, deploy, and operate their networks.
At Google Cloud, we believe that co-innovation with a broader ecosystem of customers and partners is critical to accelerating industry transformation. And so it was fantastic to be back in Barcelona at MWC 2023 this week as over 80,000 people from across the wider telecommunications landscape and adjacent industries came together to reimagine the future of the industry. Read on for a comprehensive list of all the announcements we made at the show.
This week at MWC 2023, we were excited to announce three new telecom products to help CSPs digitally transform their networks with hybrid-cloud principles and unlock new revenue opportunities. Together, the new telecom products will offer CSPs a unified cloud solution that helps them build, deploy, and operate their hybrid cloud-native networks; collect and manage network data; and improve customer experiences through artificial intelligence (AI) and analytics.
In addition, Google Cloud is providing CSPs the ability to deploy true cloud-native networks in a hybrid cloud environment, and run their network functions in any Google Cloud region on Google Kubernetes Engine (GKE).
Finally, we are very excited to see how the RAN segment of the industry is evolving, and we have extended our edge cloud capabilities so that our partners can deploy these types of high-performance applications on Google Distributed Cloud Edge.
As part of MWC 2023, we were also excited to highlight more on how we are collaborating with customers and partners from around the world to drive transformative business outcomes:
Together with Deutsche Telekom and Ericsson, we shared a significant network transformation milestone, with the successful deployment of Ericsson 5G Core cloud-native network functions (CNFs) on an on-premises implementation of Google Distributed Cloud Edge.
We announced our new collaboration with StarHub to help bolster their Cloud Infinity transformation program, and support StarHub as they create an open, secure, scalable, and energy-efficient cloud-native network for enterprises in Singapore.
We also showcased our ability to help CSPs leverage AI to drive new business outcomes, highlighting how Orange has been partnering with Google Cloud to deploy their data in new and effective ways, using real AI to solve concrete business challenges.
Finally, we partnered with Analysys Mason to explore new industry insights on how CSPs are thinking and acting on their cloud-native network journeys.
It’s been an action-packed week of customer and partner discussions, demos across our Google Cloud and partner stands, keynote presentations and fireside chats, customer, partner and product announcements, and so much more. As we reflect back, it’s clear the industry is focused not only on what the next-generation of telecommunications looks like, but how to get there together.
Thank you to all of our customers, partners, and Google teams for a truly inspiring week! At Google Cloud, we firmly believe that we are only successful when others are successful. We look forward to the collaboration that lies ahead, and the deep ecosystems that we will create together.
We would also like to thank the City of Barcelona for welcoming us once more, and Museu Europeu d’Art Modern (MEAM) with whom we collaborated to showcase artwork of Barcelona Academy of Art students inside our Google Cloud MWC stand and reception.
To learn more about how Google Cloud is partnering with CSPs, check out our Google Cloud for Telecommunications page.
Read More for the details.
A new year means a new start to re-evaluate priorities, set goals, and prepare your business for the upcoming months. IT leaders at companies of every size and shape face a critical challenge: how to make progress on multiple major initiatives at the same time. You are working to drive productivity, mitigate security threats, and manage your total cost of ownership.
We face a growing list of obstacles:
The average cost of a data breach is now topping $4.35 million.1
Only 10% of IT managers’ time is spent on strategy and innovation.2
77% of employees say they have experienced burnout at their jobs.3
Your choices matter in setting up your organization for success. In 2023, it's essential to maximize sustainable value by selecting an operating system that drives productivity, saves costs, and gives peace of mind at every endpoint.
ChromeOS prepares businesses for success in all circumstances
Google commissioned IDC, a leading third-party research firm, to conduct a study on the business value of ChromeOS across a number of use cases, including kiosks and digital signage, contact centers, virtualization (VDI), and hybrid and remote work.
IDC’s research established that ChromeOS is a cost-effective, efficient, secure, and reliable operating system that delivers many benefits, from lower costs and higher ROI to a range of operational improvements.4
Businesses that used ChromeOS and ChromeOS devices saw an average 245% return on investment (ROI), 44% lower cost of operations, and $3,901 in total savings per device over three years.4 ChromeOS was 63% faster to deploy than the other operating systems used by study participants;4 in fact, ChromeOS devices boot up in as little as six seconds, and updates install seamlessly in the background. Not only is ChromeOS fast to deploy, it's also secure, with ChromeOS devices experiencing 24% fewer security attacks than devices using other operating systems.4 There have been zero reported ransomware attacks on ChromeOS devices, ever.
Based on these findings, we believe that ChromeOS is the best choice to help you maximize cost savings, optimize operational efficiency, and boost employee productivity and security.
An operating system for every use case
In every examined use case, ChromeOS helped benefit organizations and employees:
Kiosk & Signage: Organizations who used ChromeOS and ChromeOS devices for kiosk and digital signage experienced 34% faster resolution of outages, 26% fewer outages, and 34% lower cost of digital signage.5
Hybrid & Remote Work: ChromeOS helped drive 14% higher productivity of hybrid and remote workers and 57% faster device deployment.6
Contact Centers: Contact centers using ChromeOS saw 19% higher productivity of their agents and 33% faster resolution of support tickets.7
Virtualization (VDI): Deploying ChromeOS for VDI helped drive 31% higher productivity of employees, 150% more VDI applications deployed per year, and 43% faster deployment of new VDI applications and features.8
Tune into our upcoming webinar on business resilience
These are just some of the ways that ChromeOS is helping businesses thrive. To learn more about how ChromeOS can help your business maximize sustainable value and financial outcomes, attend our upcoming webinar, Resilience Starts with IT, on March 9, 2023. You'll hear firsthand from Robyn Powell, Vice President of IT at HOTWORX, and Mark Eimer, Senior Vice President, Associate CIO and CTO at Hackensack Meridian Health, on how to prepare for success in any economic climate. We hope you tune in.
Read More for the details.
Migrate your existing in-tree disk and file volumes to CSI drivers using the provided guidance.
Read More for the details.
Amazon Detective finding groups now include a dynamic visual representation of Detective’s behavior graph to emphasize the relationships between security findings and the associated entities within a finding group. This addition makes it easier for customers to triage potential security issues with at-a-glance visuals that include finding types, severity levels, associated account(s), and linkages within the Detective behavioral graph that can be used to investigate related activity within a finding group.
Read More for the details.
Starting today, Amazon Aurora MySQL-Compatible Edition 3 (with MySQL 8.0 compatibility) supports MySQL 8.0.26. In addition to several security enhancements and bug fixes, MySQL 8.0.26 includes several changes, such as enhanced tablespace file segment page configuration and new aliases for certain identifier names. For more details, please review Aurora MySQL 3 and MySQL 8.0.26 release notes.
Read More for the details.
Amazon DevOps Guru for RDS now supports Proactive Insights, a new set of findings that inform you of impending database performance and availability issues before they become critical. With Proactive Insights, Amazon DevOps Guru for RDS continuously monitors database instances for conditions that can lead to degraded database health in the future. When such conditions are detected, Proactive Insights generates a finding that describes the nature of the impending problem and the specific actions you can take to mitigate it. Proactive Insights is available to Amazon DevOps Guru customers with no additional setup or configuration needed, and currently supports the Aurora PostgreSQL and Aurora MySQL database engines.
Read More for the details.
Azure Network Watcher announces enhanced connection troubleshooting, helping you resolve network connectivity issues comprehensively with actionable insights.
Read More for the details.
Reduce spending by moving rarely accessed data to Azure Archive Storage, now available in new regions.
Read More for the details.
Use the new burstable compute in single-node configurations to start small with distributed Postgres and scale out later.
Read More for the details.
Integrate Power BI with Azure Database for MySQL – Flexible Server directly from the Azure portal to quickly get started with building your data visualizations!
Read More for the details.
Use the new migration tool to migrate workloads from Single Server to Flexible Server on Azure Database for PostgreSQL, a managed service running the open-source Postgres database on Azure.
Read More for the details.
Azure for Operators announces the next wave of services to empower operators to modernize and monetize their 5G investments, and to enable enterprises with ubiquitous computing from cloud to edge.
Read More for the details.
Interact with Azure Virtual Network Manager (AVNM) event logs for network group membership changes.
Read More for the details.
Starting today, Amazon Neptune Serverless has lowered its minimum scaling requirement from 2.5 Neptune Capacity Units (NCUs) to 1 NCU. This lowers the cost of running Neptune Serverless by reducing the resources used by up to 2.5x when the graph database is not actively responding to user queries. With a 1 NCU minimum, you can save on your graph database by using Neptune Serverless.
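If you manage your cluster through the AWS SDK, lowering the floor looks something like this boto3 sketch; the cluster identifier and maximum capacity are illustrative.

```python
import boto3

neptune = boto3.client("neptune")

# Lower the serverless floor to the new 1 NCU minimum. Neptune
# Serverless scaling is configured on the DB cluster via
# ServerlessV2ScalingConfiguration.
neptune.modify_db_cluster(
    DBClusterIdentifier="my-graph-cluster",  # illustrative name
    ServerlessV2ScalingConfiguration={
        "MinCapacity": 1.0,   # previously the floor was 2.5 NCUs
        "MaxCapacity": 128.0,
    },
    ApplyImmediately=True,
)
```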
Read More for the details.
Amazon Relational Database Service (RDS) for SQL Server now supports Cross-Region Automated Backups with encryption. The Amazon RDS Cross-Region Automated Backups feature enables disaster recovery for mission-critical databases by letting you restore your database to a specific point in time within your backup retention period. This allows you to quickly resume operations in the event that the primary AWS Region becomes unavailable. If you've enabled encryption on the source RDS for SQL Server DB instance, you can use this feature to copy encrypted DB snapshots to regions outside of the primary AWS Region.
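Here's a hedged boto3 sketch of enabling the replication from the destination region; the ARNs, regions, retention period, and KMS key are all illustrative.

```python
import boto3

# Run from the destination region. boto3 builds the required
# pre-signed URL when you pass SourceRegion for an encrypted source.
rds = boto3.client("rds", region_name="us-west-2")

rds.start_db_instance_automated_backups_replication(
    SourceDBInstanceArn="arn:aws:rds:us-east-1:123456789012:db:my-sqlserver-db",
    BackupRetentionPeriod=7,
    # KMS key in the destination region for the encrypted backups
    KmsKeyId="arn:aws:kms:us-west-2:123456789012:key/example-key-id",
    SourceRegion="us-east-1",
)
```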
Read More for the details.
Amazon Kinesis Data Streams now supports an increased On-Demand write throughput limit of 1 GB/s, a 5x increase from the previous limit of 200 MB/s. Amazon Kinesis Data Streams is a serverless streaming data service that makes it easy to capture, process, and store streaming data at any scale. On-Demand is a capacity mode for Kinesis Data Streams that automates capacity management, so you never have to provision and manage the scaling of resources. In On-Demand mode, you pay for the throughput you consume rather than for provisioned resources, making it easier to balance costs and performance.
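Switching on On-Demand capacity is a single parameter at stream creation; here's a short boto3 sketch (the stream name is illustrative).

```python
import boto3

kinesis = boto3.client("kinesis")

# Create a stream in On-Demand mode: no shard counts to provision, and
# capacity scales automatically with traffic.
kinesis.create_stream(
    StreamName="clickstream-events",  # illustrative name
    StreamModeDetails={"StreamMode": "ON_DEMAND"},
)
```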
Read More for the details.