Abstract
This article presents an integrated analysis of educational needs and strategic actions for building an ecosystem capable of addressing the ethical, pedagogical, and cultural challenges posed by Artificial Intelligence (AI). Based on the Operational Plan of the Territorial Training Team (EFT) of Lombardy, it proposes a comprehensive training design for teachers and families aimed at fostering responsible and informed digital citizenship.
1. The Urgency of a New Literacy
The pervasive rise of Artificial Intelligence—from generative systems to automated decision-making—requires a deep reflection on the educational role of schools and the shared responsibility of families. AI is not merely a technology; it is a cognitive and cultural environment that reshapes our relationship with knowledge, with others, and with ourselves.
Research conducted by the Lombardy EFT highlights a complex picture: widespread curiosity and enthusiasm among teachers and school leaders, but also a lack of structured policies, ethical frameworks, and critical awareness. Families, meanwhile, experience both fascination and concern: fascination with the creativity and efficiency promised by AI tools, and fear of a school increasingly mediated by algorithms. In this context, integrated school–family training becomes a strategic pillar for cultivating mature and responsible digital citizenship.
2. Emerging Needs: Between Literacy and Governance
The University of Bergamo’s guidelines identify four foundational principles—Responsibility, Transparency, Safety, and Inclusion—which align closely with those of the Lombardy EFT. Both frameworks highlight convergent needs:
• AI Literacy: understanding how algorithms work; distinguishing between automation and intelligence; recognizing risks such as bias, hallucination, and plagiarism.
• AI Governance in Schools: building local policies grounded in ethics, privacy, and human oversight, consistent with the EU AI Act and the GDPR.
• Critical and Pedagogical Competence: integrating AI into daily teaching practices while preserving the authenticity of human thought.
• Community Participation: involving parents and students in transparent dialogue about digital tools, reducing misinformation and fear.
• Equity and Accessibility: ensuring that AI becomes a driver of inclusion and personalization, not exclusion.
3. Towards an Integrated Training Model
The proposed intervention consists of three complementary levels.
a. Teacher Training – “AI in Educational Practice”
A modular program (hybrid synchronous/asynchronous) including:
• Ethics and Policy: AI Act principles, privacy by design, professional accountability.
• Augmented Teaching: conscious use of ChatGPT, Gemini, Copilot, and generative tools for image and video creation.
• Assessment and Authenticity: strategies for detecting AI-generated work and fostering critical validation.
• Disciplinary Labs: practical applications in mathematics, languages, history, and science, with a focus on accessibility and inclusion.
b. Family Training – “Understanding to Accompany”
A series of interactive events inspired by the Lombardy EFT campaign “AI Awareness at School”:
• Digital Parenting: understanding opportunities and limits of AI in education and daily life.
• Educational Dialogue on AI: guiding children’s interaction with chatbots and educational platforms.
• AI Ethics for Everyday Life: privacy rights, data protection, and environmental sustainability.
• Intergenerational Workshops: collaborative school–family activities, such as ethical prompt design and storytelling with AI.
c. Systemic Actions – “AI Policy and Culture”
• Drafting adaptable AI policy templates for schools, addressing ethics, transparency, and reporting procedures.
• Creating a regional repository of case studies and best practices.
• Launching an interprovincial pilot group (one school per province) to test and refine tools.
• Developing impact indicators to assess the cultural and formative evolution of AI awareness.
4. An Educational Alliance for the Algorithmic Age
AI education cannot be reduced to another digital skill. It represents a moral and cognitive maturation process involving the entire educational community—schools, families, institutions, and civil society. We must foster an AI culture that integrates technical and human knowledge within a shared framework of responsible citizenship.
As the University of Bergamo’s guidelines state, “human supervision is mandatory”—but above all, it is an educational imperative: the assurance that behind every algorithm stands a vigilant consciousness, capable of evaluating, deciding, and correcting.
5. Conclusion
The AI era demands a renewed pedagogical alliance—teachers as cultural mediators, families as meaning-makers, students as active digital citizens. Only through continuous, inclusive, and integrated training can schools become the place where AI does not replace humanity, but rather amplifies its collective intelligence.
Normative and Bibliographic References
• Regulation (EU) 2024/1689 – Artificial Intelligence Act
• Regulation (EU) 2016/679 – General Data Protection Regulation (GDPR)
• AGID (2025), Guidelines for the Adoption of Artificial Intelligence in Public Administration
• UNESCO (2023), Guidance for Generative AI in Education and Research
• Russell Group (UK, 2024), Principles on the Use of Generative AI Tools in Education
• University of Bergamo (2025), Guidelines for the Ethical and Responsible Use of Artificial Intelligence in Teaching, Research, and Administration
• EFT Lombardia (2025), Operational Plan for the Conscious Integration of AI in Schools
Acknowledgments
This article was developed within the framework of the activities of the Équipe Formativa Territoriale (EFT) Lombardia, in collaboration with the Regional School Office (USR Lombardia) and the University of Bergamo. Special thanks to educators and families who contributed through field data, focus groups, and best practice sharing.
