Over the last few years, a slew of solutions has entered the market that allow organizations to work with their own data, and with their partners' data, without exposing it recklessly: clean rooms, trusted research environments, and tokenization. These advances support a shift toward collaborative intelligence, acknowledging that limiting data sources to what is available in-house inherently results in less representative, incomplete data.
No single institution possesses all the data it needs. This presents a huge opportunity for highly regulated industries like life sciences and financial services to work collaboratively: diversifying datasets with partners is the rising tide that lifts all boats. Working with collaborators in an ‘always on’ network to share insights in real time (e.g., cybersecurity threats or fraud signals) is equally valuable.
With so much value waiting to be tapped, how can companies work together while abiding by regulatory guardrails, addressing infosec concerns, and ensuring data remains secure?
The answer? Federated Computing.
In an exclusive interview, we spoke to Chris Laws, Chief Commercial Officer at Rhino Federated Computing, about why a centralized data model is no longer sustainable and how federation connects siloed data within institutions.
Federated computing
Federated Computing connects disparate data sources without exposing the raw data itself. Only the insights are shared back to users, while sensitive data and intellectual property (IP) are preserved.
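In practice, “sharing only insights” means each party runs computation where its data lives and returns aggregates. Here is a rough sketch of the principle in plain Python; the `Site` class and function names are illustrative, not Rhino FCP’s actual API:

```python
# Minimal illustration of federated computing: each site runs a
# computation locally and returns only an aggregate, never raw records.
# All names here are illustrative, not Rhino FCP's actual API.
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    records: list[float]  # sensitive raw data; never leaves the site

    def local_summary(self) -> tuple[float, int]:
        """Runs at the data source; only (sum, count) is shared."""
        return sum(self.records), len(self.records)

def federated_mean(sites: list[Site]) -> float:
    """Coordinator combines per-site aggregates into a global insight."""
    totals = [site.local_summary() for site in sites]  # aggregates only
    total_sum = sum(s for s, _ in totals)
    total_count = sum(c for _, c in totals)
    return total_sum / total_count

hospital_a = Site("Hospital A", [4.1, 5.2, 6.0])
hospital_b = Site("Hospital B", [3.8, 4.9])
print(federated_mean([hospital_a, hospital_b]))  # the insight, not the data
```

The coordinator never sees an individual record, only each site’s summary; that separation is what keeps sensitive data and IP at home.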
Rhino, founded in 2021, works with organizations like Eli Lilly and its TuneLab program; the Cancer AI Alliance (CAIA), which comprises the Dana-Farber Cancer Institute, Fred Hutch Cancer Center, Memorial Sloan Kettering Cancer Center, Johns Hopkins Medicine, and the Johns Hopkins Whiting School of Engineering; and the Society for Worldwide Interbank Financial Telecommunication (SWIFT), which wants to analyze data from many disparate sources for research and AI model training.
Federated Learning (FL) has been around for a decade. FL trains AI directly at the data source, eliminating the need for centralized data collection. Google began using FL to train predictive typing on phones in 2017, and data scientists have since adopted the technology as a tool for real business problems (a minimal sketch of the mechanism follows the examples below). For example:
- The FAITE Consortium, which applies federated and active learning to biologics property prediction.
- The NVIDIA Merlin framework, which, together with Toshiba Tec and McKinsey, is turning retail data into real-time decisions.
- MELLODDY (Machine Learning Ledger Orchestration for Drug Discovery), a consortium developing predictive models for drug discovery.
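Under the hood, most FL frameworks rely on some variant of federated averaging: each site trains on its own data and only model parameters travel. Below is a deliberately simplified NumPy sketch of weighted federated averaging for a linear model, illustrating the general FedAvg idea rather than any particular framework’s implementation:

```python
# Simplified federated averaging (FedAvg): each site takes a local
# training step on its own data, and only weight vectors travel back.
import numpy as np

def local_update(w, X, y, lr=0.1):
    """One gradient step of linear regression, run at the data source."""
    grad = X.T @ (X @ w - y) / len(y)
    return w - lr * grad

def fed_avg(w, site_data):
    """Average local updates, weighted by each site's sample count."""
    updates, counts = [], []
    for X, y in site_data:          # raw X, y never leave their site
        updates.append(local_update(w, X, y))
        counts.append(len(y))
    weights = np.array(counts) / sum(counts)
    return sum(wgt * upd for wgt, upd in zip(weights, updates))

rng = np.random.default_rng(0)
sites = [(rng.normal(size=(50, 3)), rng.normal(size=50)) for _ in range(4)]
w = np.zeros(3)
for _ in range(20):                 # communication rounds
    w = fed_avg(w, sites)
```

Each communication round ships a handful of floating-point weights rather than fifty patient or customer records per site, which is the whole privacy proposition.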
Rhino built the Rhino Federated Computing Platform (Rhino FCP) to solve many of the challenges facing users of open-source FL frameworks, offering a production-ready system with built-in security and privacy controls. It integrates into organizations’ tech stacks, meaning it is ready to solve real business problems.
“Increasingly, AI and data science leaders are being asked by business sponsors, ‘How do I solve a business problem that requires cross-silo collaboration in such a way that Legal, InfoSec, and Compliance functions are comfortable?’ – often without knowing the vocabulary of federated computing,” Chris said.
Rhino FCP is the answer to these questions. It’s not a piece of software seeking internal support to justify its existence, nor is it a new solution looking for problems to solve.
A platform built on local control
Information does not need to be centralized or collated for organizations to work together. Data lakes are often static and can rely on replication. Chris describes Rhino FCP as a platform that exists on top of, or alongside, existing systems.
“As a rule, we like to play nicely with other technologies,” he said. For example, if an organization has spent time and money building a pipeline in Azure and needs to integrate a supplier’s data, Rhino FCP provides a secure container for analysis and collaboration without exposing either party’s underlying data.
In many environments, problems emerge when one company or business unit knows that its data doesn’t share a schema with another’s, or when information has to be collated intelligently while following strict privacy and security regulations. This is a roadblock. Rhino built the Data Harmonization Engine to handle exactly these issues.
The engine transforms one data model into another, allowing different parties to work with a consistent schema without either party bearing the full task of normalizing, sanitizing, or anonymizing datasets.
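Rhino has not published the engine’s internals, but schema harmonization generally amounts to a declarative mapping from each source’s local fields onto a shared target schema, applied where the data lives. A hypothetical sketch, with field names and conversions invented purely for illustration:

```python
# Schematic of schema harmonization (not the Data Harmonization Engine's
# actual internals): a declarative mapping turns a site's local schema
# into the shared target schema before any federated analysis runs.

# Mapping: target field -> (source field, transform)
MAPPING = {
    "patient_id": ("pt_ref", str),
    "weight_kg": ("weight_lbs", lambda v: round(v * 0.4536, 1)),
    "sex": ("gender_code", {"M": "male", "F": "female"}.get),
}

def harmonize(record: dict) -> dict:
    """Applied locally, so raw source rows never leave the site."""
    return {tgt: fn(record[src]) for tgt, (src, fn) in MAPPING.items()}

print(harmonize({"pt_ref": 101, "weight_lbs": 154.0, "gender_code": "F"}))
# {'patient_id': '101', 'weight_kg': 69.9, 'sex': 'female'}
```

Because the transform runs at each source, every collaborator sees one consistent schema without anyone shipping raw rows to a central cleaner.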
Chris says the platform is designed to be compatible with any dataset. “We’ve intentionally built a platform that’s as flexible as possible so it doesn’t matter what the underlying data look like,” he said. Whether it’s structured tables, images, video, or waveform data, it can be federated. This means targeted business outcomes and problem-solving can be achieved using disparate data sources.
Companies can even bring their own applications to the platform as containers and have them work with data. Technical team members exploring their options can scan Rhino’s documentation library, where they can continue using SQL-based queries, Python, NVIDIA FLARE, or Rhino’s Generalized Compute Code via the SDK.
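Rhino’s SDK specifics live in its documentation, but the federated-SQL pattern described above can be sketched generically: each node runs a query against its own local store, and only aggregate rows return for a central merge. The `DataNode` class below is hypothetical, not Rhino’s API:

```python
# Hypothetical sketch of federated SQL: each node runs the query against
# its own local database and returns only aggregate rows. Class and
# method names are illustrative, not Rhino FCP's real SDK.
import sqlite3

class DataNode:
    """One collaborator's environment; its rows never leave this object."""
    def __init__(self, rows):
        self.db = sqlite3.connect(":memory:")
        self.db.execute("CREATE TABLE txns (amount REAL, flagged INTEGER)")
        self.db.executemany("INSERT INTO txns VALUES (?, ?)", rows)

    def run(self, query):
        return self.db.execute(query).fetchall()

nodes = [
    DataNode([(120.0, 0), (999.5, 1), (42.0, 0)]),
    DataNode([(15.0, 0), (87.3, 1)]),
]
# Only per-node aggregates travel; the merge happens centrally.
partials = [n.run("SELECT COUNT(*), SUM(flagged) FROM txns")[0] for n in nodes]
total = sum(c for c, _ in partials)
flagged = sum(f for _, f in partials)
print(f"{flagged}/{total} transactions flagged across all nodes")
```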
Whether dealing with existing algorithmic workloads or collating data sources for AI model training, Rhino FCP’s data federation technologies solve the fundamental data layer problem that traditional architecture imposes: the requirement to physically move or centralize sensitive data before it can be analyzed.
Practical and realistic
Laws is practical and realistic in his assertions, noting that the Rhino FCP platform and the Data Harmonization Engine get his customers “90% of the way there” to a unified data resource; a little customization is inevitable. This contrasts with bold claims from Rhino’s competitors that a black box large language model can ‘handle it all’.
This measured approach is born of the company’s background in highly regulated industries like life sciences, healthcare, and the public sector, where the handling of data and intellectual property must strictly adhere to regulatory guardrails. The platform is therefore appealing to a new group of customers across sectors including energy, financial services, automotive, and agriculture.
By presenting data in a collaborative space where normalization and federation are built in, companies can actually put their data to work and receive insights that address the thorny issues they want to crack.
For companies looking for a way to exceed the limits of centralized data while keeping their information safe, we recommend attending the upcoming TechEx Edge Computing North America Expo in San Jose, May 18-19, to meet and talk with Chris at booth #269. If what you’ve read here has made you curious to find out more, head over to Rhino’s website.
(Image sources: Pixabay (header image) under licence; Rhino FCP (interviewee))