While DNA analysis tells us the order of the amino acids in a protein, it’s much harder to predict its three-dimensional shape, which is vital for understanding its function, production and interactions. In 2020, the AI program ‘AlphaFold’ demonstrated that it could predict the structures of proteins with high accuracy, fuelling a new revolution across many fields of biomedical research and winning its creators a Nobel Prize in Chemistry.
What can AI do for genomic medicine? If an AI model produces a result that sits outside our current understanding, but doesn’t necessarily contradict it, can we trust it? Focus group participants who were presented with this scenario didn’t always have a clear answer. For many, the scenario reaffirmed the need for human oversight of these tools.
Despite its amazing advancements, generative AI raises substantial concerns. With its roots deep in Western data, could GAI inadvertently become a tool of digital colonisation? Because these systems are trained mostly on data shaped by Western perspectives, there’s a risk of them acting like digital colonisers, spreading a uniform cultural narrative across diverse global landscapes.
Many educators see AI as a tool to enhance teaching and learning, not as a replacement for teachers but as a complement to their skills. With the release of the Australian Framework for GAI in Schools, educators are being equipped with the knowledge and frameworks to guide the responsible and ethical use of generative AI to benefit students, schools, and society.
It might sound daunting to talk to kids about new and complicated technology, but learning is a beautiful and rich experience at any age, and there are plenty of great tools to help, whether you’re learning for yourself or with the little ones around you. And it’s important: AI isn’t going away, and by educating young people we can make sure these new technologies are used appropriately in the future.