KTP Outreach Pack
KTP Partner Outreach — RAIDT
Purpose: secure a UK industry partner to co-fund a Knowledge Transfer Partnership (KTP) with the University of Portsmouth, embedding a researcher to deploy RAIDT in their organisation.
Mechanism (one-liner for partners): Innovate UK funds 50–67% of total project costs; the partner contributes the remainder, which covers the salary of a graduate KTP Associate (~£40–55k/yr) embedded in the partner organisation for 18–36 months, plus associated project costs. Academic supervision is included.
Target list — prioritised
A. Audit & assurance firms (highest revenue ceiling)
- KPMG UK — Trusted AI / Responsible Tech practice
- Deloitte UK — Trustworthy AI Institute, Audit & Assurance
- PwC UK — Responsible AI practice
- EY UK — Trusted AI / AI Confidence
- BDO, Mazars, Grant Thornton — mid-tier alternatives, more accessible decision-makers
B. Regulated sector anchor customers
- NHS Trust with active GenAI deployment (e.g., Great Ormond Street, Imperial, UCLH)
- A retail/challenger bank (e.g., NatWest, Lloyds, Starling, Monzo) — risk function
- A UK insurer with AI claims/underwriting (Aviva, Direct Line)
- A law firm with AI tooling (DLA Piper, Linklaters, Allen & Overy)
- A defence supplier (BAE Systems, QinetiQ, Roke) — sovereign assurance angle
C. AI assurance vendors / RegTech
- Holistic AI, Credo AI (UK), Trilateral Research, Faculty AI, PRGX, Logically AI
- Smaller vendors are easier to land but offer a smaller revenue ceiling
D. Public sector
- DSIT AI Directorate
- AISI (UK AI Security Institute)
- Government Digital Service (GDS, within DSIT)
- NHS Transformation Directorate
Email Template — Version A (Audit Firms)
Subject: RAIDT — a peer-reviewed evidence framework for your GenAI assurance practice
Dear [First Name],
The EU AI Act, ISO/IEC 42001, and the FCA's GenAI scrutiny are pushing regulated organisations to evidence — not just claim — that their AI use is governed, auditable, and contestable. The methodologies most firms have today rely on principles and model cards. When a dispute arises, that evidence is rarely sufficient.
I lead the development of RAIDT at the University of Portsmouth — a peer-reviewed run-level evidence framework that captures, scores, and audits individual GenAI runs against five governance dimensions (Responsibility, Auditability, Interpretability, Dependability, Traceability), with explicit mappings to the EU AI Act, ISO 42001, and the NIST AI RMF. The framework is published across a trilogy of academic papers and is being trialled in healthcare, financial services, and public-sector settings.
I'd like to explore a Knowledge Transfer Partnership (KTP) with [Firm Name] to embed RAIDT into your AI assurance methodology. KTPs are an Innovate UK scheme: a graduate KTP Associate works full-time at [Firm Name] for 18–36 months under joint supervision, with Innovate UK funding 50–67% of total project cost. For your firm, this typically means £40–80k/yr cash contribution for a complete methodology, training programme, and certification scheme — not a research project, but a productised audit methodology you can deploy on engagements.
I attach a one-page brief and the published Foundations paper for your review. Could we find 30 minutes in the next two weeks for a call?
Kind regards,
Mohammad Ali Akeel
PhD Researcher, School of Organisations, Systems and People
University of Portsmouth
mohammad.akeel@myport.ac.uk
Email Template — Version B (Regulated Sector Risk/Compliance Lead)
Subject: GenAI evidence packs for [Sector] — University of Portsmouth KTP partnership
Dear [First Name],
[Organisation] is using generative AI in [specific use case — clinical decision support / credit explanation / claims triage / contract review]. Under the EU AI Act, your sector regulator [MHRA / FCA / SRA / PRA / ICO] increasingly expects organisations to evidence what happened in a given AI-assisted decision — prompt, model configuration, retrieval context, human review, and the basis for the final action.
My research at the University of Portsmouth has produced RAIDT, a run-level evidence framework that gives your team a structured way to capture this evidence and score governance readiness on a 1–5 scale across five dimensions. RAIDT is mapped to ISO/IEC 42001, the EU AI Act, and the NIST AI RMF.
I'd like to propose a Knowledge Transfer Partnership between [Organisation] and the University of Portsmouth. A KTP places a full-time graduate associate inside your team for 18–36 months to operationalise RAIDT against your specific GenAI deployments. Innovate UK funds 50–67% of project costs; your contribution covers the remainder, including part of the associate's salary and project costs (typically £40–60k/yr for an SME, somewhat higher for larger organisations).
Concretely, after the KTP, you would have:
- a deployed RAIDT evidence pipeline for your GenAI use cases
- audit-ready evidence packs for [sector regulator]
- a trained internal capability and documented playbook
- a peer-reviewed academic foundation for your AI governance position
Could I send a one-page brief and find 20 minutes for a short call?
Kind regards,
Mohammad Ali Akeel
PhD Researcher, University of Portsmouth
mohammad.akeel@myport.ac.uk
Email Template — Version C (AI Assurance Vendor / RegTech)
Subject: RAIDT — academic foundations to harden your AI governance product
Dear [First Name],
[Vendor] is selling AI governance/assurance tooling into a market that increasingly asks "where is the academic backing for your methodology?" — particularly under EU AI Act audit and ISO 42001 certification.
I lead development of RAIDT at the University of Portsmouth: a published, peer-reviewed run-level evidence framework with explicit standards mappings (EU AI Act, ISO 42001, NIST AI RMF) and sector playbooks. RAIDT is methodology, not software — which is exactly what a tooling vendor needs as a defensible foundation.
A Knowledge Transfer Partnership between [Vendor] and Portsmouth could integrate RAIDT into your product as: (a) the schema for evidence capture, (b) the scoring engine, (c) the standards mapping layer, and (d) the certification methodology. Innovate UK funds 50–67% of total project cost. For [Vendor], this is academic credibility, methodology IP, and product depth at SME-affordable rates.
Is this a 30-minute conversation worth having? Happy to share the published papers and a product-fit brief in advance.
Kind regards,
Mohammad Ali Akeel
PhD Researcher, University of Portsmouth
mohammad.akeel@myport.ac.uk
One-page brief — attach to first email
RAIDT in one page
- What it is: a run-level evidence framework for governing generative AI in organisations.
- Who it's for: organisations using GenAI in regulated or high-stakes work, and the firms that audit, certify, or supply governance tooling to them.
- What it produces:
- A run-level evidence pack — bounded record of one GenAI run (prompt, model, retrieval, output, human review, final use)
- A 5-pillar scoring profile — governance readiness scored 1–5 on Responsibility, Auditability, Interpretability, Dependability, Traceability
- What it is mapped to: EU AI Act (Articles on transparency, oversight, logging, risk management); ISO/IEC 42001 (AI management systems); NIST AI RMF + GenAI Profile.
- Validation: peer-reviewed trilogy of papers (Foundations, Empirical Validation via influence methods, Interoperable Governance pathways); design-science methodology.
- Differentiator: prevailing approaches govern at model or policy level. RAIDT governs at run level — the actual unit at which evidence, contestability, and accountability live.
- Lead: Mohammad Ali Akeel (PhD, Portsmouth); Supervisors: Prof. Mark Xu, Dr Awais Shakir Goraya, Dr Salem Chakhar.
Process notes for you
- Personalise every email. Add one specific signal that you researched the recipient — a recent talk, paper, product launch, or LinkedIn post. Templates without personalisation get binned.
- Send Tuesday–Thursday, 7–9am. Highest open rates for senior recipients.
- Track in a simple sheet: Name | Org | Sent date | Reply | Meeting | Stage. Keep this.
- Follow up once after 8 working days if no reply — one short bump, then drop.
- First-meeting goal is NOT to close. Goal is to (a) confirm fit, (b) identify the internal sponsor, (c) get a follow-up with technical people. KTP applications take 8–12 weeks to shape — start now.
- Portsmouth's KTP Office can co-pitch: bring them into the second meeting once interest is confirmed. They handle the application paperwork.