Equiano Institute Alignment
AI alignment is a field of research that seeks to ensure that artificial intelligence systems work in the interests of humans.
Register for our alignment workshops, a flexible introduction to AI Safety ideas.
The workshops present the technical AI Safety tools needed by anyone interested in safeguarding, understanding, and steering AI systems.
These workshops are organised by Benjamin Sturgeon.