Sal Khan’s Post

Sal Khan

Khan Academy, Schoolhouse.world, Khan Lab School, Khan Lab High School

A lot of school district leaders have been asking what they should think about when using AI tools. Here is a list of some things to consider:

Must-haves:
- For under-13 (possibly under-18) users, teachers and/or administrators should have transparency into how students are using the tools (i.e., have access to student conversations/transcripts and can get summaries of student AI activity)
- For under-18 users, there must be clear moderation mechanisms that keep students from engaging in negative use cases (self-harm, hate, etc.). These mechanisms must proactively notify key stakeholders (e.g., teachers and administrators) when the system detects these situations (see the sketch after this list)
- AI vendor must use high-quality underlying models that minimize errors and hallucinations (today this is GPT-4, Gemini Pro 1.5, Anthropic Claude 3 or better). GPT-3.5 is not acceptable for student/teacher use.
- AI vendor must have clear contracts with AI model creators stipulating that student data cannot be used for training of the general, public models
- AI vendor must be clear that student/educator data will not be sold to third parties under ANY circumstances (even in the event of bankruptcy or an acquisition)
- AI vendor must have clean SOC-2 audits; this ensures that student and educator data is protected/secure

Nice-to-haves:
- AI vendor has taken care to prevent use of AI for cheating
- AI vendor has added layers to the base model to minimize errors/hallucinations
- AI vendor has evaluation/benchmarking mechanisms to measure the AI error rate
- AI vendor has school/district/system-level reporting so that AI use can be monitored centrally
- AI vendor has professional development for teachers to understand how to deploy in the classroom
- AI vendor has training for students to understand how to use the AI tools (and how to mitigate risks)
- AI tool supports multiple languages
- AI tool can be "one-stop" across grades and subject matter to avoid fragmentation of the experience (and the district having to manage multiple solutions)
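For the proactive-notification must-have above, here is a minimal sketch (in Python) of the kind of flow a vendor might describe. It assumes a hypothetical moderation classifier and alerting hook; classify_message and notify_stakeholders are illustrative placeholders, not any specific product's API:

# Illustrative sketch only: one way the "proactive notification" requirement
# could be wired. classify_message and notify_stakeholders are hypothetical
# placeholders, not any specific vendor's API.
from dataclasses import dataclass
from typing import Iterable

FLAGGED_CATEGORIES = {"self_harm", "hate", "violence"}  # example policy set

@dataclass
class ModerationResult:
    categories: set       # categories the classifier flagged
    transcript_id: str    # links back to the stored conversation

def classify_message(text: str) -> set:
    """Hypothetical call to the vendor's moderation model."""
    raise NotImplementedError

def notify_stakeholders(result: ModerationResult, recipients: Iterable) -> None:
    """Hypothetical alerting hook (email, dashboard, SIS integration, etc.)."""
    raise NotImplementedError

def handle_student_message(text: str, transcript_id: str, stakeholders: Iterable) -> bool:
    """Return True if the message was escalated to teachers/administrators."""
    flagged = classify_message(text) & FLAGGED_CATEGORIES
    if flagged:
        notify_stakeholders(ModerationResult(flagged, transcript_id), stakeholders)
        return True
    return False

The important contract point is that any escalation also stays linked to the stored transcript, so teachers and administrators can review the full conversation, in line with the transparency must-have above.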

Hemant Yadav

Growth @ Branch | Deep links & Attribution

2mo
Huda Baig

Product Manager I Strategist I Entrepreneur I Startup Advisor

2mo
Jason Katcher

VP Strategic Channel AI Partnerships @ Merlyn Mind, Ex-Google & Dropbox Education

2mo

Merlyn Mind is FERPA/COPPA compliant and SOC-2 audited: https://trust.merlyn.org/. Know what your AI solution is doing. Are they building on AI or building AI? You owe it to your teachers and students to be responsible stewards of AI.

Mark Strauss

Global Key Account Manager at Multi-Color Corporation

2mo

So we can assume that the answer to the question “do we need it at all in our schools”, is yes?

Akash Gupta

Quality Engineer at Barclays

2mo

I like the list of AI vendor requirements and what Khan Academy can do for the school system. That's an excellent idea.

Jaya Kandaswamy

SVP, Product/Innovation | AI Product Strategy | Mentor, Startup Advisor

2mo

The issue with training data related to kids is a big one to solve. How do we keep it separate without on-prem implementations? What are the implications there? How do we ensure data is handled appropriately? Age-old question.

CLIFFORD TOMB

Author, English Coach, Business & Healthcare Analyst & Consultant

2mo

#TheWrathofKHAN!

Shashwat Chandra

Principal Architect at Microsoft

2mo

You should also consider:
- Teacher-Student Interaction: AI should enhance the learning experience by supporting teachers and students, rather than substituting for their interactions.
- Current School Scoring Methods: AI should not supplant existing student assessment methods; instead, it should work alongside them to provide valuable insights.

Thanks for these ideas and guidelines for making AI safer and more relevant in education settings!

The low-hanging fruit is to use AI as a teaching assistant, with the clear aim of: 1) raising teacher competence (scalable continuous professional development), 2) helping assess, track, and monitor student progress, under similar security and privacy settings to non-AI tools, and 3) reducing admin overhead and improving productivity. Students, especially in the early education years, need not be exposed to AI tools directly (from Day 1) without proper safeguards already in place; there are many important issues we must resolve first, as exemplified in this post. Let's learn to walk first before we attempt to run. P.S. College/university-level education is a different ball game altogether, as students there are essentially adults who should be able (and allowed) to make their own decisions. The bigger challenge there, IMO, is the relationship between human teachers and AI. Get that right and we should be looking at accelerated learning pathways as the norm.
