
Academic Secondment Grant: Developing and evaluating safe, trustworthy, and privacy-preserving systems in decentralised community interactions


Overview

The AIM Academic Secondment is designed to enable a SPRITE+ Expert Fellow to conduct a placement within an organisation and collaboratively work on a research challenge. The scheme operates through three phases.

  1. Potential AIM partners submit expressions of interest; if selected, they work with the SPRITE+ director to define their research collaboration opportunity.

  2. SPRITE+ Expert Fellows and members express interest in that research collaboration opportunity by submitting an application.

  3. The successful Expert Fellow or member then works with the project partner to develop a full proposal which goes to the SPRITE+ management team for review.

Funding available: £10,000 for academic buyout time

Eligible applicants: UK-based academic researchers, including PhD students

Outcome: This project will advance applied research by developing and validating Trust, Identity, Privacy, Protection and Security (TIPPS) mechanisms for trusted, privacy-preserving decentralised events.

Project topic: Developing and evaluating safe, trustworthy, and privacy-preserving systems in decentralised community interactions

For this AIM Academic Secondment, we invite applications from researchers with expertise in machine learning, data security, digital identity, human–computer interaction, responsible AI, or socio-technical systems. The successful applicant will collaborate with Impromptu, a West London-based community-building startup founded by Imperial College academics, to instil ‘best-in-class’ data practices, ensuring safer, more trusted interactions that build confidence in the platform, drive social impact, and support adoption.


Background:

About Impromptu

Predicated on accessible, hyperlocal, decentralised events, Impromptu fosters connectivity within siloed communities while empowering community-led regeneration through Impromptu Plus, a dedicated portal that connects its users with local businesses and institutions whilst furnishing quantitative metrics. This dual approach addresses societal challenges such as isolation - recognised by the WHO as a global health concern - while creating verifiable social impact that aligns with broader initiatives promoting participation, holistic development, economic revitalisation, and community cohesion.

By lowering barriers for local stakeholders, reflecting local needs, and building social capital through shared activity, Impromptu’s decentralised events serve as touchpoints for inclusion, safe spaces for connection, vehicles for skills, learning, and belonging, and drivers of local pride and empowerment. This generates credible, bottom-up social ESG data, directly supporting SPRITE+ priorities by embedding trusted, privacy-preserving digital infrastructure, enabling evidence-based measurement of social impact, and fostering responsible, community-focused innovation.


Background of the Project:

Impromptu is currently in a beta soft launch with key stakeholders across North Acton, Harlesden, and White City, including Imperial College, OPDC, The Republic of Park Royal, White City Warehouse, AWOL Residences, and numerous local startups and SMEs. The platform has evolved into a market-ready product delivering unquantified social impact, with core functionality such as UI, booking, ticketing, payments, QR check-ins, user profiles, feedback, filtering, and notifications, alongside extended features including optional KYC verification, verified-only events, female-only events, and dedicated business profiles. The full rollout of Impromptu Plus is planned for Q2 2026, expanding to local councils, cities, and wider networks within 12–18 months, with the goal of creating a connected digital ecosystem that empowers autonomous action while fostering sustainable, community-led social impact and potentially enabling integrated ESG reporting.

Developing Impromptu Plus requires a multidisciplinary team addressing marketplace safety, privacy-preserving identity, decentralised trust, and social impact data generation. Expert oversight from a data security scientist with policy and regulatory knowledge is essential to ensure GDPR-compliant implementation. Collaboration with a machine learning specialist experienced in content moderation and behavioural analysis will enable insights into in-app behavioural patterns and support the development of key functionalities such as hazard detection, trust scoring, and community safety mechanisms.
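The call does not specify how trust scoring would be implemented. Purely as an illustrative sketch, the snippet below combines hypothetical signals a platform of this kind might hold (KYC status, event history, attendee feedback) into a bounded organiser trust score. All field names and weights are assumptions for illustration, not Impromptu's design.

```python
from dataclasses import dataclass

@dataclass
class OrganiserRecord:
    """Hypothetical per-organiser signals (illustrative, not the platform's schema)."""
    kyc_verified: bool      # passed optional KYC verification
    events_completed: int   # events held without incident
    events_flagged: int     # events flagged by attendees or moderation
    mean_feedback: float    # average attendee rating, normalised to [0, 1]

def trust_score(r: OrganiserRecord) -> float:
    """Toy weighted trust score in [0, 1]; the weights are arbitrary examples."""
    total = r.events_completed + r.events_flagged
    # New organisers with no history get a neutral prior of 0.5
    history = r.events_completed / total if total else 0.5
    score = 0.3 * float(r.kyc_verified) + 0.4 * history + 0.3 * r.mean_feedback
    return round(min(max(score, 0.0), 1.0), 3)
```

A production mechanism would need far richer inputs (report recency, reviewer reputation, appeal outcomes) and careful auditing for bias, but the shape of the problem - aggregating noisy behavioural signals into an interpretable, bounded score - is the same.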


Research Focus:

The collaboration aims to implement safe, trustworthy, and privacy-preserving digital infrastructure, including:

  • AI/ML-driven hazard intelligence for detecting malicious actors, bots, and fraudulent events.

  • Trust scoring and verification mechanisms to measure organiser reliability and participant safety.

  • Privacy-preserving analytics to extract social impact insights from decentralised event data.

  • Scalable KYB/KYC identity verification workflows for community and business participants.

This creates a real-world testbed for applied research on decentralised trust, privacy-preserving AI, and regulatory-compliant digital ecosystems.

Objective: Evolving a Real-World Testbed for Digital TIPPS Research 

The implementation of these methodologies will allow the platform to serve as an applied research environment, focusing on (a) privacy-preserving AI at scale, (b) decentralised ML governance, (c) human–AI trust in Web3 environments, and (d) regulatory compliance in decentralised contexts. Key research questions posed by this work include:

  • What social ESG insights can be extracted automatically from event data without a manual reporting burden?

  • Are incentives needed for event attendees to feed back social ESG data voluntarily?

  • Do privacy-preserving technologies enable meaningful social impact measurement?

  • How can granular impact measurement be balanced with user privacy?

  • Can AI identify "social washing" by analysing consistency between stated ESG commitments and actual event execution?


Expected Applicant Background

Applicants should have a background in computer science with strong ML/NLP foundations and demonstrable experience designing and running evaluations of language models.

Required Experience (One or More)

  • A track record of research in relevant fields (e.g. machine learning, data security, digital identity, human–computer interaction, responsible AI, or socio-technical systems).

  • Experience translating research into applied or experimental settings, such as pilots, living labs, or real-world testbeds.

  • Ability to contribute to research outputs, evaluation frameworks, and knowledge exchange aligned with SPRITE+ objectives.

  • Expertise in digital Trust, Identity, Privacy, Protection and Security (TIPPS) as applied to data-driven or decentralised digital systems.

  • Experience with machine learning or data science methods relevant to platform safety, behavioural analysis, risk detection, or trust mechanisms.

  • Strong understanding of data protection, privacy, and regulatory compliance, including GDPR and related governance frameworks.

  • Experience working with real-world, user-generated or platform data, including the design of privacy-preserving and auditable data workflows.

  • Ability to contribute to applied, interdisciplinary research, bridging technical implementation, governance, and social outcomes.

  • Capacity to engage in collaborative research with industry and community partners, aligned with SPRITE+ values of impact, responsibility, and trust.

Desirable Experience

  • Safety research for generative models (robustness, harmful content, reliability, uncertainty calibration)

  • Human-in-the-loop evaluation and annotation studies (including guideline/rubric development).


Timeline

  • Call for applications sent out: 23rd February 2026

  • Applications deadline: 3rd April 2026

  • Interviews: Week commencing 13th April 2026

  • Applicants informed of outcome: 27th April 2026

  • Placement to start no later than 1st June 2026

  • Project to be completed by no later than 31st January 2027


Funding

Maximum total funding size is £10,000. The funding can be used towards academic buy-out time, as well as covering reasonable travel costs. Funds awarded will be subject to standard UKRI grant terms and conditions, which are non-negotiable.

The duration of work funded by this project will be no longer than 6 months, commencing no later than 1st June 2026 and finishing no later than 31st January 2027.

Eligible items for funding:

  • Replacement salary costs of the Expert Fellow

  • Costs of workshops/meetings

  • Travel and subsistence expenses


Application Process:

Application deadline: Friday 3rd April 2026

Download the call document:

Download the application form:

