
Duality Technologies
Founded Year: 2016
Stage: Incubator/Accelerator - II | Alive
Total Raised: $50.05M
Last Raised: $50K | 3 yrs ago
Mosaic Score: -6 points in the past 30 days (the Mosaic Score is an algorithm that measures the overall financial health and market potential of private companies)
About Duality Technologies
Duality Technologies focuses on secure data collaboration. The company offers a platform that enables organizations to collaborate on and analyze data securely without ever decrypting it, including services such as privacy-protected artificial intelligence development and encrypted queries on secured datasets. It primarily sells to the financial services, healthcare, government, and marketing sectors. The company was founded in 2016 and is based in Hoboken, New Jersey.
ESPs containing Duality Technologies
The ESP matrix leverages data and analyst insight to identify and rank leading companies in a given technology landscape.
The federated learning platforms market enables model training across multiple decentralized devices or data sources without centralizing sensitive data. These platforms allow organizations to develop AI models collaboratively while maintaining data privacy, security, and regulatory compliance. Key features include privacy-preserving training techniques, secure model aggregation, and integration w…
Duality Technologies is named an Outperformer among 15 other companies, including Amazon, IBM, and Google.
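To make the "secure model aggregation" step in the market description above concrete, the sketch below shows the core of federated averaging: each participant trains on its own data and shares only model parameters, which a coordinator combines into a weighted average. This is a generic illustration of the technique, not code from Duality Technologies or any of the ranked vendors, and the parameter vectors and sample counts are made-up values.

```cpp
#include <cstddef>
#include <iostream>
#include <vector>

// Federated averaging (FedAvg): combine locally trained model parameters into a
// global model, weighting each client by how many samples it trained on.
// Raw training data never leaves the clients; only parameter vectors are shared.
std::vector<double> federated_average(const std::vector<std::vector<double>>& client_params,
                                      const std::vector<std::size_t>& sample_counts) {
    const std::size_t dim = client_params.at(0).size();
    std::vector<double> global_params(dim, 0.0);

    double total_samples = 0.0;
    for (std::size_t n : sample_counts) total_samples += static_cast<double>(n);

    for (std::size_t c = 0; c < client_params.size(); ++c) {
        const double weight = static_cast<double>(sample_counts[c]) / total_samples;
        for (std::size_t i = 0; i < dim; ++i) {
            global_params[i] += weight * client_params[c][i];
        }
    }
    return global_params;
}

int main() {
    // Hypothetical parameter vectors from three participants, each trained locally.
    std::vector<std::vector<double>> clients = {
        {0.10, 0.80, -0.30},
        {0.12, 0.75, -0.25},
        {0.09, 0.82, -0.35},
    };
    std::vector<std::size_t> samples = {5000, 12000, 3000};

    std::vector<double> global_model = federated_average(clients, samples);
    for (double w : global_model) std::cout << w << " ";
    std::cout << "\n";  // weighted average of the three local models
    return 0;
}
```

In real platforms the aggregation step itself is typically protected as well, for example with secure multiparty computation or homomorphic encryption, so that the coordinator never sees any individual participant's update in the clear.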
Duality Technologies's Products & Differentiators
Zero Footprint Investigation
Research containing Duality Technologies
Get data-driven expert analysis from the CB Insights Intelligence Unit.
CB Insights Intelligence Analysts have mentioned Duality Technologies in 1 CB Insights research brief, most recently on Sep 6, 2023.

Sep 6, 2023
The data security market map
Expert Collections containing Duality Technologies
Expert Collections are analyst-curated lists that highlight the companies you need to know in the most important technology spaces.
Duality Technologies is included in 4 Expert Collections, including Cybersecurity.
Cybersecurity
11,252 items
These companies protect organizations from digital threats.
Digital ID In Fintech
268 items
For this analysis, we looked at digital ID companies working in or with near-term potential to work in fintech applications. Startups here are enabling fintech companies to verify government documents, authenticate with biometrics, and combat fraudulent logins.
AI 100 (All Winners 2018-2025)
100 items
Artificial Intelligence
10,195 items
Duality Technologies Patents
Duality Technologies has filed 25 patents.
The 3 most popular patent topics include:
- cryptography
- cryptographic attacks
- block ciphers

Application Date | Grant Date | Title | Related Topics | Status
---|---|---|---|---
10/13/2022 | 1/28/2025 | | Cryptography, Cryptographic hardware, Disk encryption, Data management, Cryptographic attacks | Grant
Latest Duality Technologies News
May 22, 2025
Gary Drenik is a writer covering AI, analytics and innovation.

According to Kurt Rohloff, CTO and co-founder of leading privacy-enhanced secure data collaboration software vendor Duality Technologies, the recent bipartisan move to ban the AI platform DeepSeek from U.S. government devices signals far more than just national security concerns—it’s a red flag for the broader trajectory of generative AI. “DeepSeek’s potential vulnerabilities are a symptom of a larger, more pressing issue with how society is trying to deploy generative AI,” Rohloff says. “The privacy architecture of most GenAI systems simply isn’t designed for the regulatory realities many sectors face.”

In May 2025, Senators Bill Cassidy and Jacky Rosen introduced legislation to bar DeepSeek from federal contracts, citing concerns over the platform’s acknowledgment of routing user data to China. For Rohloff, however, this kind of reactive regulation only scratches the surface. “The foundational problem persists: generative AI platforms bring serious risks of structural privacy flaws, and existing security measures aren't cutting it.”

This unease is shared by consumers as well. A recent Prosper Insights & Analytics survey found that 58.6% of consumers are extremely or very concerned about their privacy being violated by AI using their data. Rohloff believes this sentiment is well-founded, especially in sectors like government, finance, and healthcare, where the stakes of data mishandling are existential.

(Chart: Prosper Insights & Analytics, “How concerned are you about privacy being violated from AI using your data?”)

When AI doesn’t forget

Generative AI models are fueled by data—immense volumes of it. Yet what makes them so powerful also makes them uniquely vulnerable. “These systems have the potential to consume everything you feed them: user prompts, documents, even behavioral cues,” Rohloff explains. “But unlike traditional software, they learn from and sometimes regurgitate that data. That creates a massive attack surface.”

Many organizations, he notes, don’t fully understand what data their AI systems are ingesting. Without proper oversight, confidential or regulated information can unintentionally enter model training cycles or be exposed during inference. The result can be catastrophic, particularly in regulated sectors where data exposure carries legal, ethical, and financial consequences.

“In critical sectors, even a minor lapse could mean leaked state secrets, manipulated financial trades, or breached patient records,” Rohloff warns. “And once trust is lost in these systems, it can take decades to rebuild.”

He points to technical threats like model inversion attacks, where bad actors reconstruct training data by repeatedly querying models, and prompt injections, where cleverly crafted inputs can override safety controls and extract restricted information. These aren’t theoretical issues; they’re live threats that are already being tested.

Policy is catching up—but slowly

Recent government action is beginning to reflect the urgency. In January, the White House issued Executive Order 14179, which aims to boost U.S. AI leadership while emphasizing the importance of secure development practices. In April, the Office of Management and Budget released memoranda directing agencies to establish standards around AI testing, monitoring, and the handling of personally identifiable information.
“These are encouraging steps,” Rohloff says, “but we can’t audit or regulate our way out of flawed design. Privacy needs to be built in at the architecture level, not patched on after deployment.”

Financial incentives also raise the stakes. The 2024 IBM Cost of a Data Breach report found the healthcare industry’s average breach cost to be $9.8 million, higher than in any other sector. Rohloff sees that as an urgent reminder that the costs of under-secured AI are already tangible.

A new paradigm: Privacy-Enhancing Technologies

For Rohloff, the solution lies in a category of techniques known as Privacy-Enhancing Technologies, or PETs. Of these, Fully Homomorphic Encryption (FHE) stands out. “FHE lets us run computations on encrypted data without ever decrypting it,” he explains. “It flips the traditional model on its head. Data can remain protected at every step—at rest, in transit, and in use.”

This innovation addresses a core vulnerability in current AI pipelines. Traditionally, sensitive data must be decrypted before an AI model can process it, leaving it briefly exposed. FHE eliminates that exposure altogether, making it possible to perform even complex machine learning operations without ever viewing the plaintext data. “The point is to make strong encryption usable in real-world AI deployments,” says Rohloff. “And we’re finally getting there.”

His company, Duality Technologies, has helped drive this shift by developing tools that apply FHE to real-world applications like finance, healthcare, and cross-enterprise data collaboration. Open-source platforms like OpenFHE—evolved from earlier libraries like PALISADE—have also accelerated adoption by offering practical, performance-optimized implementations of multiple FHE schemes.

Encryption, collaboration, and compliance

The implications of FHE go beyond individual data protection. “It enables confidential collaboration across organizations,” Rohloff explains. “Think hospitals working together on patient analytics without ever revealing personal records. Or financial institutions conducting joint fraud detection without compromising proprietary data.”

This capability is increasingly vital as AI workflows stretch across jurisdictions and regulatory frameworks. FHE helps organizations maintain compliance with laws like HIPAA, GDPR, and CCPA by ensuring that data is never processed in an unencrypted state. Equally important, FHE prevents the AI model itself from “learning” anything sensitive. “Even if the model is compromised,” Rohloff says, “it doesn’t have access to the actual data. That’s a game-changer for trust and resilience.”

The role of leadership in secure AI

For Rohloff, adopting FHE and other PETs isn’t just about technical hygiene. It’s a strategic imperative. “Waiting for regulations to force your hand is a losing strategy,” he warns. “Technical leaders must act now to secure AI’s future.”

That means making PETs part of an organization’s AI strategy from the start, not as a compliance afterthought. It requires vetting tools for data handling risks, demanding encrypted-by-default architectures, and investing in secure development skills across technical teams. “We need a culture shift,” Rohloff says. “It’s not enough to trust the vendors. Leaders must ask hard questions, fund the research, and collaborate across sectors to set new norms.”
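To give a sense of what “running computations on encrypted data” looks like in code, here is a minimal sketch modeled on the OpenFHE library’s published CKKS examples: two vectors are encrypted, added and multiplied while still encrypted, and only the results are decrypted. The parameter values (multiplicative depth, scaling modulus, batch size) are illustrative assumptions for a toy workload, and this is not Duality Technologies’ production code.

```cpp
#include <iostream>
#include <vector>

#include "openfhe.h"

using namespace lbcrypto;

int main() {
    // Illustrative CKKS parameters for a tiny workload (assumptions, not
    // production settings): depth 1 supports one ciphertext multiplication.
    CCParams<CryptoContextCKKSRNS> parameters;
    parameters.SetMultiplicativeDepth(1);
    parameters.SetScalingModSize(50);
    parameters.SetBatchSize(8);

    CryptoContext<DCRTPoly> cc = GenCryptoContext(parameters);
    cc->Enable(PKE);
    cc->Enable(KEYSWITCH);
    cc->Enable(LEVELEDSHE);

    auto keys = cc->KeyGen();
    cc->EvalMultKeyGen(keys.secretKey);  // relinearization key needed by EvalMult

    // The data owner encrypts; everything after this point operates on ciphertexts.
    std::vector<double> x1 = {0.25, 0.5, 0.75, 1.0};
    std::vector<double> x2 = {5.0, 4.0, 3.0, 2.0};
    auto ct1 = cc->Encrypt(keys.publicKey, cc->MakeCKKSPackedPlaintext(x1));
    auto ct2 = cc->Encrypt(keys.publicKey, cc->MakeCKKSPackedPlaintext(x2));

    // An untrusted party can add and multiply without ever seeing the plaintexts.
    auto ctAdd  = cc->EvalAdd(ct1, ct2);
    auto ctMult = cc->EvalMult(ct1, ct2);

    // Only the secret-key holder can decrypt the (approximate) results.
    Plaintext result;
    cc->Decrypt(keys.secretKey, ctAdd, &result);
    result->SetLength(x1.size());
    std::cout << "x1 + x2 ~= " << result << std::endl;

    cc->Decrypt(keys.secretKey, ctMult, &result);
    result->SetLength(x1.size());
    std::cout << "x1 * x2 ~= " << result << std::endl;
    return 0;
}
```

CKKS operates on approximate real-number arithmetic, which is why the decrypted values are close to, rather than exactly equal to, the plaintext sums and products; OpenFHE also implements exact-integer schemes such as BGV and BFV.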
Duality Technologies Frequently Asked Questions (FAQ)
When was Duality Technologies founded?
Duality Technologies was founded in 2016.
Where is Duality Technologies's headquarters?
Duality Technologies's headquarters is located at 5 Marine View Plaza, Hoboken.
What is Duality Technologies's latest funding round?
Duality Technologies's latest funding round is Incubator/Accelerator - II.
How much did Duality Technologies raise?
Duality Technologies raised a total of $50.05M.
Who are the investors of Duality Technologies?
Investors of Duality Technologies include Defense TechConnect (DTC) Innovation Summit and Expo, Plug and Play Insurtech, Team8, Intel Capital, Hearst Ventures and 5 more.
Who are Duality Technologies's competitors?
Competitors of Duality Technologies include Roseman Labs, Bitfount, Omnisient, Rhino Federated Computing, Opaque and 7 more.
What products does Duality Technologies offer?
Duality Technologies's products include Zero Footprint Investigation and 1 more.
Who are Duality Technologies's customers?
Customers of Duality Technologies include Dana Farber.
Compare Duality Technologies to Competitors

Enveil is a privacy-enhancing technology company that focuses on protecting data in use across various sectors. The company offers ZeroReveal solutions that enable secure data usage, collaboration, and monetization without compromising data privacy or security. Enveil's products are designed to allow organizations to extract insights and analyze data across boundaries and silos while maintaining the confidentiality and ownership of the underlying data. It was founded in 2016 and is based in Fulton, Maryland.

Apheris provides access to distributed data for machine learning and analytics within the technology and healthcare sectors. The company offers the Apheris Compute Gateway, a solution that enables federated machine learning and analytics without the need to centralize sensitive data. Apheris serves sectors that require data compliance and security, such as the pharmaceutical and healthcare industries. It was founded in 2019 and is based in Berlin, Germany.

Decentriq provides data clean rooms within the technology and data collaboration sectors. The company offers a platform for businesses to collaborate on sensitive data without sharing the actual data, using privacy technologies to ensure compliance. Decentriq serves industries such as media, healthcare, banking, and the public sector. It was founded in 2019 and is based in Zurich, Switzerland.
Omnisient specializes in data collaboration within the financial services and consumer brands sectors. The company provides a platform for businesses to share and monetize information while ensuring consumer privacy and data protection. Omnisient serves sectors including retail, telecommunications, healthcare, banking, insurance, and credit bureaus. It was founded in 2019 and is based in Cape Town, South Africa.
Roseman Labs provides data analysis solutions in the healthcare, security, and defense sectors. The company offers services that enable the encryption, linking, and analysis of sensitive data, while adhering to regulations such as GDPR. Roseman Labs serves sectors that require data privacy measures, including healthcare research and cybersecurity. It was founded in 2020 and is based in Utrecht, Netherlands.

Zama operates as an open-source cryptography company that specializes in fully homomorphic encryption (FHE) solutions for the blockchain and AI sectors. The company offers products that enable computation on encrypted data, allowing for privacy-preserving machine learning and confidential smart contracts. Zama's solutions cater to industries that require data privacy and security, such as finance, healthcare, and identity verification. It was founded in 2019 and is based in Paris, France.