A.L.B.E.R.T (the Awareness Lab's Big Enlightenment Realization Technology) is an experimental work and functioning prototype by the Awareness Lab for attentional and contemplative training. It demonstrates a speculative future of immersive, artificial-intelligence-assisted learning: multi-modal biofeedback, real-time algorithmic cognitive assessment, generative audio/visual environments, and participant-specific learning.

The experience is narrative, experiential, generative, and responsive to the participant's biorhythms, which indicate attention. The environment, UI, mental maps, and guided feedback (provided by a large language model trained on secular mindfulness) all adapt based on targets defined by the lab and the hardware.

This emergent and advanced technology may be crude compared to the elegance of thousands of years of tradition, tools, and technique from qualified teachers. Yet the ambition is the same: to help someone discover themselves, train their attention, and explore their ontology through instruction, interpretation, and feedback. This platform (and others emerging like it) utilizes a media environment that generates itself from the internal and external states of the participant, potentially illuminating unique opportunities for growth in mindfulness and other contemplative modalities.
Team:
Jesse Reding Fleming: Director
Shane Bolan: Technical Director + Game Engine Developer
Trystan Nord: Game Engine Developer + Sound Design
Max Urbany: Machine Learning + Artificial Intelligence Design and Development
Jay Kreimer: Composer / Sound Design
Maital Neta: Cognitive Scientist
Mike Dodd: Cognitive Scientist
HP Educause / HP Omnicept: Equipment Sponsor
Funded by the University of Nebraska's Office of Research and Economic Development (ORED) and the Awareness Lab