Henoyo. Here. Now. Yours.
GDPR

AI in your own cloud, under your own GDPR controls.

The first GDPR question is usually not about the model. It is about the boundary. Where does the data live? Who is the controller? Who is the processor? What new transfer risk did we just create by turning on AI? Henoyo is built to make those questions easier to answer, not harder. The software deploys into your own AWS or Azure account, under your IAM, inside your security boundary. There is no shared customer infrastructure and no standing access for Henoyo to your data. Your Salesforce tokens live in your own secrets store. Your logs stay in your environment. Your existing regional choices for data residency, retention, and access control can continue to apply.

At a glance

  • Henoyo runs as processor under Article 28, in your own AWS or Azure region
  • No new transfer mechanism — data never leaves your perimeter
  • Immutable audit trail supports DSARs and Article 35 DPIAs
  • DPA available on request, typically within 2 business days

Controller, processor, and Article 28.

Under GDPR, role clarity comes first. In most deployments, you remain controller because you determine the purposes and means of the processing. Henoyo acts as processor to the extent it processes personal data on your documented instructions, with the software running inside your own cloud environment. That maps cleanly to Article 28, where controllers must use processors that provide sufficient guarantees around technical and organisational measures.

The practical advantage is that Henoyo is not asking you to move your customer data into a new multi-tenant SaaS perimeter just to use AI. The runtime sits in your AWS or Azure account. Access is governed by your IAM. Secrets stay in your own AWS Secrets Manager or Azure Key Vault. That does not remove your Article 28 obligations, but it does make them easier to document in a DPA, easier to explain in procurement, and easier to defend in a privacy review.

Lawful basis still belongs to the customer.

Henoyo does not create a new lawful basis for processing. You still need one. If you are using AI to support customer service, internal operations, fraud prevention, or contract performance, the lawful-basis analysis remains yours as controller. If you are relying on consent for a specific use case, that consent still needs to be captured, recorded, and respected in the underlying business process.

What changes with Henoyo is the control surface. Prompts, Skills, Data Context Mappings, Agents, and Audit give you a governed way to decide what data is exposed to a use case, what gets masked, what gets logged, and who can invoke it. That matters for Article 5 principles like purpose limitation and data minimisation. A Prompt should not see more than it needs. A Skill should not be callable by everyone. A Data Context Mapping should pin the exact fields in scope.
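To make the minimisation point concrete, a Data Context Mapping can be thought of as an allow-list of fields plus per-field masking rules. The sketch below is illustrative Python under that reading; the field names, the `MAPPING` shape, and the `apply_mapping` helper are hypothetical, not Henoyo's actual configuration format.

```python
# Illustrative only: a Data Context Mapping treated as an allow-list of
# fields plus per-field masking. Names and shapes are hypothetical.

MAPPING = {
    "allowed_fields": ["case_number", "product", "email"],
    "masked_fields": {"email"},  # exposed only in masked form
}

def apply_mapping(record: dict, mapping: dict) -> dict:
    """Return only the fields the mapping pins, masking where required."""
    out = {}
    for field in mapping["allowed_fields"]:
        if field in record:
            value = record[field]
            if field in mapping["masked_fields"]:
                value = "***"  # stand-in for real masking or tokenization
            out[field] = value
    return out

record = {"case_number": "C-1042", "product": "Widget",
          "email": "jane@example.com", "notes": "free text"}
print(apply_mapping(record, MAPPING))
# {'case_number': 'C-1042', 'product': 'Widget', 'email': '***'}
```

The point of the shape is that anything not pinned by the mapping (here, `notes`) never reaches the Prompt at all, which is the purpose-limitation posture Article 5 asks for.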

Data residency, international transfers, and Schrems II.

A lot of GDPR anxiety around AI is really transfer anxiety. If personal data leaves the EEA, what is the transfer mechanism? Are SCCs needed? What supplementary measures exist? How does this hold up after Schrems II? Henoyo reduces the number of moving parts in that conversation. The software runs in the region and cloud account you choose. If you already operate in an EU AWS or Azure region, Henoyo can run there too. Your logs, tokenization store, and application runtime stay in that environment.

That does not make transfer analysis disappear. If you choose a model provider or sub-processor that involves a transfer, you still need to assess that path and put the right contractual and technical measures in place. The architecture helps because you are not automatically creating a second copy of your operational data in a vendor-controlled environment. Structured tokenization, field-level masking, and customer-controlled deployment are all relevant supplementary measures when privacy teams assess cross-border risk.

DSARs, erasure, and evidence.

GDPR compliance is not only about where data sits. It is also about what happens when a person exercises rights. Access, deletion, correction, restriction, and portability requests do not go away because AI is involved. Most companies need to answer a simpler question first: did the AI system process this person's data, and if so, where is the evidence?

This is where the audit model matters. Henoyo writes an immutable audit trail with six-year retention by default, exportable to CSV, S3, or Splunk. That gives privacy, security, and legal teams a record of what Prompt or Skill ran, through which channel, and under what controls. For right-to-erasure workflows, the important point is that Henoyo is designed to avoid creating unnecessary shadow copies. If the source record is deleted or updated in your system of record, the AI layer does not need to become a second long-term repository of personal data.
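One way to reason about "immutable" here is hash chaining: each audit entry commits to the hash of the previous entry, so any later edit to history is detectable on verification. The following is a generic sketch of that idea, not Henoyo's storage format; the entry shape and the `append_event` and `verify` names are illustrative.

```python
import hashlib
import json

# Generic hash-chain sketch of an append-only audit log. Each entry
# commits to the previous entry's hash, so edits break verification.

GENESIS = "0" * 64

def append_event(log: list, event: dict) -> None:
    prev = log[-1]["hash"] if log else GENESIS
    body = {"event": event, "prev": prev}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    log.append({**body, "hash": digest})

def verify(log: list) -> bool:
    prev = GENESIS
    for entry in log:
        body = {"event": entry["event"], "prev": entry["prev"]}
        recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["prev"] != prev or entry["hash"] != recomputed:
            return False
        prev = entry["hash"]
    return True

log = []
append_event(log, {"actor": "agent-1", "skill": "summarize_case", "channel": "web"})
append_event(log, {"actor": "analyst-7", "skill": "export_audit", "channel": "api"})
print(verify(log))   # True
log[0]["event"]["actor"] = "someone-else"   # tamper with history
print(verify(log))   # False
```

A record with this property is exactly what a DSAR response needs: evidence of which Prompt or Skill touched a person's data, with tampering ruled out.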

Article 22 and human oversight.

Article 22 is where many AI projects get over-interpreted. Not every AI-assisted workflow is solely automated decision-making with legal or similarly significant effects. But some use cases can move in that direction, especially if a company starts using AI to score, rank, approve, deny, or route people in ways that materially affect them. That is where governance matters.

Henoyo is designed to support human-in-the-loop deployment patterns. Agents can surface recommendations, summaries, and next steps without forcing full automation. Skills can be permissioned. Audit records can show who invoked what and when. That does not by itself answer every Article 22 question, but it gives you the primitives needed to keep meaningful human review in the process and to document that review in a DPIA under Article 35 where required.
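A minimal sketch of those primitives, assuming hypothetical role and skill names (this is not Henoyo's API): low-impact Skills are broadly invocable, while higher-impact ones are role-restricted and wait for recorded human approval before executing.

```python
# Sketch of permissioned Skills with a human approval gate.
# Role names, skill names, and the flow are illustrative assumptions.

SKILL_ROLES = {
    "summarize_case": {"agent", "analyst"},   # low impact: broadly invocable
    "close_account": {"analyst"},             # higher impact: restricted
}

def invoke(skill: str, caller_role: str, human_approved: bool = False) -> tuple:
    if caller_role not in SKILL_ROLES.get(skill, set()):
        return ("denied", "caller lacks permission")
    if skill == "close_account" and not human_approved:
        # Article 22 posture: significant actions wait for human review
        return ("pending", "awaiting human review")
    return ("ok", skill + " executed")

print(invoke("summarize_case", "agent"))                        # ('ok', 'summarize_case executed')
print(invoke("close_account", "agent"))                         # ('denied', 'caller lacks permission')
print(invoke("close_account", "analyst"))                       # ('pending', 'awaiting human review')
print(invoke("close_account", "analyst", human_approved=True))  # ('ok', 'close_account executed')
```

Logged alongside the audit trail, the "pending" and approval steps become the documentary evidence a DPIA can point to.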

Requirement | How Henoyo addresses it
Article 28 — processor obligations | Customer-cloud deployment, customer IAM, DPA on request
Article 32 — security of processing | Field-level masking (AWS Comprehend / Microsoft Presidio), structured tokenization with 15-minute TTL, prompt-injection defenses, immutable audit
Article 35 — DPIA support | Clear system boundaries, logging, configurable retention
Article 22 — automated decisions | Human-in-the-loop primitives, permissioned Skills, audit trail
International transfers | Regional deployment in your own cloud plus customer choice over model providers and transfer mechanisms
DSAR and erasure | Avoiding unnecessary copies, keeping evidence in audit trail

From GDPR requirement to product mechanism.

  • Article 28 processor controls map to customer-cloud deployment, customer IAM, and DPA support.
  • Article 32 security of processing maps to field-level masking with AWS Comprehend or Microsoft Presidio, structured tokenization with a 15-minute TTL in DynamoDB or Cosmos DB, prompt-injection defenses, and immutable audit.
  • Article 35 DPIA work maps to clear system boundaries, logging, and the ability to show exactly which fields are exposed through Data Context Mappings.
  • Article 22 concerns map to human oversight patterns, permissioned Skills, and audit evidence.
  • International transfer concerns map to regional deployment in your own cloud plus customer choice over model providers and transfer mechanisms.
  • DSAR and erasure workflows map to avoiding unnecessary copies and keeping evidence in the audit trail.

Sub-processors

Sub-processor | Purpose | Region
AWS | Compute, storage (AWS deploys) | Customer's
Azure | Compute, storage (Azure deploys) | Customer's
Anthropic | Optional model provider (Claude) | Customer's
OpenAI | Optional model provider (GPT-4o) | Customer's
Deepgram | Voice speech-to-text (only if voice channel enabled) | US
Salesforce | Source system (customer's own instance) | Customer's

Request more detail

Need a specific compliance questionnaire, a signed DPA, or a technical architecture review? We typically respond within one business day.