Platforms

application level, support for Wi-Fi and Thread as communication mediums and backing from major tech companies set it apart from other protocols like Zigbee. Although the number of Matter-enabled devices is still relatively low, its growing importance in the IoT space makes it worth assessing for those looking to build smart home and IoT solutions.

39. Modal
Assess

Modal is a platform as a service (PaaS) that offers on-demand compute without the need for your own infrastructure. Modal lets you deploy machine learning models, massively parallel compute jobs, task queues and web apps. It provides a container abstraction that makes the switch from local to cloud deployment seamless, with hot reload both locally and in the cloud. It even removes deployments automatically when they’re no longer needed, avoiding manual clean-up, although deployments can also be made persistent. Modal was built by the same team that developed the first recommendation engine for Spotify. It takes care of the AI/ML stack end to end and can provide on-demand GPU resources, which is useful if you have particularly intensive computational needs. Whether you’re working on your laptop or in the cloud, Modal just works, providing an easy and efficient way to run and deploy your projects (a brief sketch of its programming model appears at the end of this section).

40. Neon
Assess

Neon is an open-source alternative to AWS Aurora PostgreSQL. Cloud-native analytical databases have embraced the technique of separating storage from compute nodes so they can scale elastically on demand; doing the same in a transactional database is much harder. Neon achieves it with a new multi-tenant storage engine for PostgreSQL. With minimal changes to the mainstream PostgreSQL code, Neon uses AWS S3 for long-term data storage and elastically scales compute up or down, including scale-to-zero. This architecture brings several benefits, including cheap and fast clones, copy-on-write and branching (see the connection sketch at the end of this section). We’re quite excited to see new innovations on top of PostgreSQL. Our teams are evaluating Neon, and we recommend you assess it as well.

41. OpenLineage
Assess

OpenLineage is an open standard for collecting lineage metadata from data pipelines, designed to instrument jobs as they run. It defines a generic model of run, job and data set entities with consistent naming conventions, and the core lineage model can be extended with facets that enrich those entities. OpenLineage solves the interoperability problem between producers and consumers of lineage data, who would otherwise have to integrate with each other in a variety of ad hoc ways. Although there is a risk of it becoming yet another “standard in the middle,” its status as a Linux Foundation AI & Data Foundation project increases its chances of gaining widespread adoption. OpenLineage currently supports metadata collection from several platforms, such as Spark, Airflow and dbt, although users need to configure its listeners. Support for OpenLineage data consumers is more limited at this time.
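To make the Modal entry above more concrete, here is a minimal sketch of its programming model, assuming a recent version of the Modal Python SDK (older releases used modal.Stub rather than modal.App). The app name, function body and GPU type are illustrative only.

```python
import modal

# An app groups the functions Modal will run in its cloud containers.
app = modal.App("sentiment-demo")  # hypothetical app name

# Requesting a GPU is a single argument on the decorator; Modal provisions
# the container on demand and tears it down when the work is finished.
@app.function(gpu="T4")
def classify(text: str) -> str:
    # Placeholder for real model inference.
    return "positive" if "good" in text.lower() else "negative"

# `modal run sentiment_demo.py` runs this entrypoint locally while the
# decorated function executes in Modal's cloud; `modal deploy` keeps the
# deployment around persistently.
@app.local_entrypoint()
def main():
    print(classify.remote("Modal makes GPU jobs feel good to run"))
```

The point of the sketch is that moving from local execution to a cloud deployment is a CLI switch rather than a code change, which is the container abstraction the entry describes.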
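Because Neon presents itself as standard PostgreSQL, trying it from application code only requires pointing an ordinary driver at a Neon endpoint. The sketch below uses psycopg2; both connection strings are placeholders, and the assumption that each branch exposes its own endpoint reflects our reading of Neon's branching model.

```python
import psycopg2

# Each Neon branch (e.g., main vs. a copy-on-write dev branch) exposes its
# own endpoint, so switching branches means switching connection strings.
# Both URLs are placeholders.
MAIN_BRANCH_URL = "postgresql://app_user:secret@main-endpoint.example.neon.tech/appdb"
DEV_BRANCH_URL = "postgresql://app_user:secret@dev-endpoint.example.neon.tech/appdb"

def count_orders(dsn: str) -> int:
    # From the driver's point of view this is plain PostgreSQL; scale-to-zero
    # and the S3-backed storage engine are invisible to the application.
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            cur.execute("SELECT count(*) FROM orders")
            return cur.fetchone()[0]

if __name__ == "__main__":
    # Compare a table on the main branch with a cheap clone of it.
    print("main:", count_orders(MAIN_BRANCH_URL))
    print("dev branch:", count_orders(DEV_BRANCH_URL))
```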
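To illustrate what producing OpenLineage events looks like, here is a minimal sketch using the openlineage-python client as we understand its API; the backend URL, namespaces and dataset names are invented for the example, and in practice the Spark, Airflow or dbt integrations mentioned above would emit these events for you.

```python
import uuid
from datetime import datetime, timezone

from openlineage.client import OpenLineageClient
from openlineage.client.run import Dataset, Job, Run, RunEvent, RunState

# Point the client at whichever backend consumes OpenLineage events
# (the URL is a placeholder, e.g., a local Marquez instance).
client = OpenLineageClient(url="http://localhost:5000")

PRODUCER = "https://example.com/pipelines/daily-orders"  # identifies the emitter

run = Run(runId=str(uuid.uuid4()))
job = Job(namespace="example-pipelines", name="daily_orders_load")

# Emit a START event describing the run, the job and its input/output
# data sets; a matching COMPLETE (or FAIL) event with the same runId
# would close out the run.
client.emit(
    RunEvent(
        eventType=RunState.START,
        eventTime=datetime.now(timezone.utc).isoformat(),
        run=run,
        job=job,
        producer=PRODUCER,
        inputs=[Dataset(namespace="postgres://warehouse", name="public.orders")],
        outputs=[Dataset(namespace="s3://data-lake", name="curated/orders")],
    )
)
```

Facets attached to the run, job or data set entities are how richer metadata is carried on top of this core model.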