5. Synthesize the market forecasts with machine learning
Prediction markets fill the gap when there isn’t enough historical data to rely on machine learning alone. Most often, you have some data to inform a strategic decision, but not enough to determine the right choice. For example, perhaps you have ample time-series data on your customers’ spending behavior on one platform, but nothing yet for a platform you haven’t launched.
There are two ways to synthesize prediction markets and machine learning. First, use the output of the prediction market as an input to a larger predictive analytics system that also draws on machine-learned inference over time-series data. Alternatively, add machine learning systems as market makers, providing baseline bid-ask spreads and liquidity that increase the incentive for accurate predictions.
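As a rough illustration of the first approach, the sketch below treats the market-implied probability as just another feature in an ordinary regression model. The feature names, model choice, and toy data are illustrative assumptions, not the platform's actual pipeline.

```python
# Sketch: feed the market-implied probability into a standard regression
# model alongside conventional time-series features. All names and data
# here are toy placeholders.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(42)
n = 500

# Toy historical features: trailing spend, growth rate, and the probability
# the prediction market assigned while each outcome was still open.
trailing_spend = rng.normal(100, 10, n)
growth_rate = rng.normal(0.10, 0.05, n)
market_prob = rng.uniform(0, 1, n)
X = np.column_stack([trailing_spend, growth_rate, market_prob])

# Toy target: the outcome eventually observed (e.g. first-quarter revenue).
y = 0.5 * market_prob + 0.005 * trailing_spend + rng.normal(0, 0.1, n)

model = GradientBoostingRegressor().fit(X, y)

# At decision time, score the latest time-series features together with the
# market's current quote.
latest = np.array([[105.0, 0.12, 0.63]])
print(f"Blended forecast: {model.predict(latest)[0]:.3f}")
```

The same idea works in reverse for the second approach: a trained model can post baseline quotes into the market as an automated market maker.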
Architecting the Platform on Google Cloud
By leveraging Google Cloud Platform (GCP), we were able to build the entire platform with a small team in less than a year and scale to thousands of users. We’d like to share the specific GCP tools we used to prototype, build, and grow it so quickly:
Serverless is the workhorse of the platform. We use App Engine, and we’re also experimenting with Cloud Run and Cloud Functions to implement the trading backend.
Serverless is a great toolkit for building an application before you decide which backends you might want to manage yourself on Compute Engine. It handles deployments, versioning, and resource scaling without requiring additional code, which lets us stay flexible with the backend architecture.
Cloud Scheduler and Cloud Tasks complement a serverless architecture well, making it easy to run periodic jobs, like awarding badges to our top predictors, and long-running background tasks, like resolving markets and calculating analytics (see the Cloud Tasks sketch below).
Cloud Firestore, running in Datastore mode, served as our application database. As a scalable NoSQL cloud database, it lets you develop your frontend flexibly, updating your schema as you design and build more features. It also supports complex, hierarchical data: in our case, storing bids, asks, and traded positions as children of the market entity makes querying for data fast and easy (see the query sketch below).
Using Data Studio, you can build dashboards from the predictions, for example to visualize trends across markets over time. With Looker, you can combine the market predictions with your existing predictive analytics to produce better insights than either would produce alone.
Cloud Firestore also supports an easy export to BigQuery for analytics and to BigQuery ML for machine learning, letting you refine the human consensus algorithmically (see the BigQuery ML sketch below). Once the data is in BigQuery, Vertex AI lets you train forecasting models with little to no code, giving the best synthesis of human and machine reasoning on difficult strategic questions.
Dataflow lets us run large batch operations on the datastore, such as restoring from a backup, as a managed process.
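For the Cloud Tasks item above, here is roughly what enqueueing a market-resolution job looks like from the Python client; the project, location, queue, handler path, and payload are hypothetical placeholders.

```python
# Sketch: enqueue a long-running market-resolution job on a Cloud Tasks
# queue, to be processed by an App Engine handler. Project, location,
# queue, and handler path are placeholders.
from google.cloud import tasks_v2

client = tasks_v2.CloudTasksClient()
parent = client.queue_path("my-project", "us-central1", "market-resolution")

task = {
    "app_engine_http_request": {
        "http_method": tasks_v2.HttpMethod.POST,
        "relative_uri": "/tasks/resolve_market",
        "headers": {"Content-Type": "application/json"},
        "body": b'{"market_id": "new-platform-launch"}',
    }
}

response = client.create_task(request={"parent": parent, "task": task})
print(f"Enqueued {response.name}")
```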
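And for the Cloud Firestore item, this is a rough sketch of the hierarchical data model described above, using the Datastore-mode Python client; the kinds, key names, and properties are illustrative, not the platform's actual schema.

```python
# Sketch: store bids as children of their Market entity and fetch a
# market's open bids with an ancestor query. Kinds, key names, and
# properties are illustrative.
from google.cloud import datastore

client = datastore.Client()
market_key = client.key("Market", "new-platform-launch")

# Write a bid under the market entity.
bid = datastore.Entity(key=client.key("Bid", parent=market_key))
bid.update({"trader_id": "trader-123", "price": 0.63,
            "quantity": 100, "status": "open"})
client.put(bid)

# Ancestor queries keep reads for a single market fast and strongly consistent.
query = client.query(kind="Bid", ancestor=market_key)
query.add_filter("status", "=", "open")
for open_bid in query.fetch(limit=20):
    print(open_bid["trader_id"], open_bid["price"], open_bid["quantity"])
```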
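Once market prices have been exported to BigQuery, a forecasting model over them can be trained directly in SQL via BigQuery ML; the dataset, table, and column names below are assumptions about what such an export might look like.

```python
# Sketch: train a BigQuery ML time-series model on exported market prices.
# Dataset, table, and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()
client.query("""
    CREATE OR REPLACE MODEL `my_project.markets.prob_forecast`
    OPTIONS (
      model_type = 'ARIMA_PLUS',
      time_series_timestamp_col = 'trade_date',
      time_series_data_col = 'closing_probability',
      time_series_id_col = 'market_id'
    ) AS
    SELECT market_id, trade_date, closing_probability
    FROM `my_project.markets.daily_prices`
""").result()
```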
We also used a variety of basic GCP tools for the operational side of running our prediction market:
Logs Explorer to debug tricky situations, such as when queries time out or we hit an issue with an internal API.
Cloud Profiler and Cloud Trace to identify scaling bottlenecks early, such as unnecessary synchronous Cloud Firestore queries that could be done asynchronously (see the initialization sketch below).
Cloud Error Reporting to configure monitoring for errors, jump to the underlying issues, and see which errors were most frequent.
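Cloud Profiler needs only a small amount of setup in the service itself; here is roughly what that initialization looks like in Python, with placeholder service name and version.

```python
# Sketch: start the Cloud Profiler agent at application startup.
# The service name and version are placeholders.
import googlecloudprofiler

try:
    googlecloudprofiler.start(
        service="prediction-market-backend",
        service_version="1.0.0",
    )
except (ValueError, NotImplementedError) as exc:
    # Profiling is best-effort; never block the app if the agent can't start.
    print(f"Cloud Profiler not started: {exc}")
```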
Get Started
Ready to take your predictive analytics to the next level and unlock the wisdom of your organization? If you’d like to discuss building a platform like this with us, please provide your information in this form.