#Alignment is the process of incorporating human feedback when training a model.
09 Feb 2023,   16:03
 
✅ An #AI system that is aligned works toward the goal it was designed to achieve.
 
❌ An AI system that is misaligned is competent at advancing some goals, but not the one it was designed to achieve.
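
To make "incorporating human feedback" concrete, here is a minimal sketch of the core step in preference-based alignment (RLHF-style): training a reward model from pairwise human preference labels. It assumes PyTorch and uses toy synthetic data; the names (`reward_model`, `chosen`, `rejected`, `true_direction`) are hypothetical, not from any specific library.

```python
# Minimal sketch: learn a scalar reward model from pairwise human preferences.
# Assumes PyTorch; data here is synthetic, standing in for human-labeled pairs.
import torch
import torch.nn as nn

torch.manual_seed(0)

FEATURE_DIM = 16   # in practice, features would come from a model's outputs
NUM_PAIRS = 256    # number of (preferred, rejected) response pairs

# Hypothetical synthetic data: humans "prefer" responses whose features
# point along a hidden true-preference direction.
true_direction = torch.randn(FEATURE_DIM)
chosen = torch.randn(NUM_PAIRS, FEATURE_DIM) + 0.5 * true_direction
rejected = torch.randn(NUM_PAIRS, FEATURE_DIM) - 0.5 * true_direction

# The reward model maps a response's features to a single score.
reward_model = nn.Linear(FEATURE_DIM, 1)
optimizer = torch.optim.Adam(reward_model.parameters(), lr=1e-2)

for step in range(200):
    r_chosen = reward_model(chosen).squeeze(-1)      # score of preferred response
    r_rejected = reward_model(rejected).squeeze(-1)  # score of rejected response
    # Bradley-Terry / pairwise logistic loss: push the preferred response's
    # reward above the rejected one's.
    loss = -nn.functional.logsigmoid(r_chosen - r_rejected).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

accuracy = (reward_model(chosen) > reward_model(rejected)).float().mean()
print(f"final loss {loss.item():.3f}, preference accuracy {accuracy.item():.2%}")
```

The learned reward model would then be used to steer training (e.g. as the reward signal in RL fine-tuning), which is how human feedback ends up shaping the model's goal rather than just its outputs.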