Highlights
What it means to be truly data literate
Data as a scalable model
Empowering students through agency
Data, AI and the learning curve
How data informs lifelong learning
AI should be for every learner
Closing the digital divide on campus
ZotGPT
Equity in practice
Responsible AI use
Digital Squared, a podcast
Episode wrap
Welcome to Season 3, Episode 4 of Teach & Learn: A Podcast for Curious Educators, by D2L. Hosted by Dr. Cristi Ford and Dr. Emma Zone from the Academic Affairs team, the podcast features candid conversations with some of the sharpest minds in the K-20 education space. We discuss trending educational topics and teaching strategies, and delve into the issues plaguing our schools and higher education institutions today.
We are living in an increasingly digital world where nearly everything we do generates data. This information can be used to come to profound and impactful decisions that create tremendous benefits—but only if we can interpret, understand and put the insights to use.
The trouble is that although many of us are data-aware, we’re far from data-literate. And for the data-driven world of today, and the data-saturated world of the future, our next guest says that’s not good enough. Institutions have a responsibility to ensure all students develop data fluency, something that starts with closing digital divides.
Tom Andriola, Vice Chancellor of Technology and Data and Chief Digital Officer at UC Irvine, joins Dr. Cristi Ford on this episode of Teach & Learn.
Together, they discuss:
- what it means to be truly data-literate
- why data is the currency of the future
- how institutional leaders, staff and faculty can boost their data literacy
- the role of AI in data
- closing the digital divide
- UC Irvine’s campus-wide AI platform, ZotGPT
Full Transcript
Dr. Cristi Ford:
What does it truly mean to be data literate in today’s tech-centric world? I’m Dr. Cristi Ford and I’m speaking with Tom Andriola. Together we discuss the importance of fostering a data-driven culture at our institutions, how to empower students as they navigate the overwhelming influx of information, and how to help them think critically about the data they encounter.
Dr. Emma Zone:
Welcome to Teach and Learn, a podcast for curious educators brought to you by D2L. Each week we’ll meet some of the sharpest minds in the K to 20 space. Sharpen your pencils. Class is about to begin.
Dr. Cristi Ford:
As I thought about the very first time I met you in January in Long Beach, we were talking about data then. And so having you here thinking about data being the new fluency, it’s ubiquitous in talking about platform play. Can you just start by sharing with our listeners what it means? And given this trend, what do you envision the platforms of the future to be?
Tom Andriola:
Yeah, so we throw the term data literacy around pretty easily without really asking the question that you’re asking, which is, what do we mean when we say that? So I’m going to split it out: let’s first talk about being data aware. I like to talk about how the world has been taken over by more technology, and technology really is just generating more data. Things that we used to do in an analog form are now being done in a digital form. We’re using technology for those things, and all this data is falling out of it. This interaction is a great example. In the old days, just a few years ago, you and I would’ve been in a studio and we would’ve been talking across a table, each with a microphone. Now we’re using technology for this and it’s capturing our images.
So you could run computer vision against those images to tell you whether I’m a compulsive liar. We can capture the whole transcript, and you can evaluate my communication effectiveness through an algorithm, looking at word choice, inflection and voice tonality. So our world is being consumed now and represented through data, and we’re computing against it. That’s what I mean about becoming data aware. A simple example I use with people: how many people have pulled out a paper map and used it to get from point A to point B, versus using technology, the phone, the data (which is the map) and the route (which is the algorithm)? And how many people have really said, “Yeah, you know what? I don’t trust that route. I’m going to do it my way.” We don’t. We’ve come to trust the technology. And so that’s being data aware.
And let me give you an example in the classroom for being data literate. If you were to ask an engineering professor, “Tell me the most challenging concept for your students to grasp,” that professor might say, “Oh, well, fluid dynamics is always a challenging one for them.” Well, that’s an educated opinion. But in this data-driven world, where we have learning management systems and all sorts of EdTech plug-ins, you can now look at the learning management system and the level of engagement. Did the students watch the video? How many times did they go back and review the same components? How many questions did they answer? And so now you have a data-informed approach saying, actually, it isn’t fluid dynamics, it’s Bernoulli’s equation.
And so that’s what I mean about being data literate. How do you go back and train people to say, look, human judgment is still very, very important, but the research says that if you complement a data-driven or data-informed approach with your experience and intuition, that’s the best decision-making framework in terms of delivering the best and most consistent results. So when you talk about being data literate, it’s being able to create a new framework for decision-making that uses data to give you evidence to complement what our brains do for us.
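To make the engineering example above a bit more concrete, here is a minimal sketch of the kind of LMS engagement analysis Andriola describes. It is purely illustrative: the CSV export and column names (topic, video_rewatches, quiz_score) are hypothetical, and a real learning management system would expose this data through its own reporting tools or APIs.

```python
import pandas as pd

# Hypothetical export of LMS engagement data, one row per student per topic.
events = pd.read_csv("lms_engagement_export.csv")

# Aggregate per topic: where do students rewatch the most and score the lowest?
by_topic = events.groupby("topic").agg(
    avg_rewatches=("video_rewatches", "mean"),
    avg_quiz_score=("quiz_score", "mean"),
    students=("quiz_score", "size"),
)

# Rank topics by a simple "struggle" signal: heavy rewatching plus low scores.
by_topic["struggle_index"] = (
    by_topic["avg_rewatches"].rank(pct=True)
    + (1 - by_topic["avg_quiz_score"].rank(pct=True))
)

print(by_topic.sort_values("struggle_index", ascending=False).head(5))
```

The point isn’t the specific metric; it’s that the professor’s hunch (“fluid dynamics is the hard part”) can be checked against evidence that may point somewhere more specific, like Bernoulli’s equation.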
Dr. Cristi Ford:
I love your approach of framing this as an on-ramp in terms of data awareness. I really appreciated how you framed that piece out. And when you think about that data awareness and the other pieces you just mentioned, we have all these great data scientists who are typically on campuses. Why can’t we just relegate this work to them and have them work in the corners of the universities to figure these things out?
Tom Andriola:
Yeah. Well, first of all, it’s just an incompatible model for scalability and progress. To the question you’re asking, if I go back to the map example, it’s like if I had to call someone to say, “I need to get from Irvine to Newport Beach, what’s the best route for me?” It’s not a very scalable model. What we want to do is give individuals agency to do for themselves and teach them the requisite skills to do for themselves, which is what education is all about.
And so this concept of how do you take data, how do you put it into a usable form? How do you push it out to put it in the hands of individuals and teach them how to have agency around how they use the data? Because employee A, B, and C are ultimately, even if they have the same role in the organization, let’s say they’re all advisors, their situations are going to be different, their timings are going to be different, and their experiences will take them to different types of questions.
So to make it a scalable model, you want to create data assets that you can then put in the hands of the individuals who need the data at the time they need it, and then teach them to be literate with it: how do I figure out the question I want to ask? Go get an answer to that question. And if it leads me to a second question, how do I keep going until I get a data-informed answer to go along with what my brain tells me I should do?
So we have to kind of push it to the edges. And we’ve seen that in so many different places: when we try to control things through a small central group of experts, it ultimately breaks down as not being the right model, whether we’re talking about organizations or governments. Ultimately we want to push things out to the end user.
Dr. Cristi Ford:
I love the focus that you have on empowering through agency. And so as I think about maybe the work you’re doing at UC Irvine, for those listeners that are trying to do this at their campuses to increase data literacy, how do we build capacity with our institutional leaders or staff members or faculty to be able to see the sustainable change happen?
Tom Andriola:
Yeah, so first of all, I’ve been fortunate to have institutional leadership buy-in around these concepts, so a data-informed culture, and I’ve been able to work with others to build educational and professional development programs and on-ramps to, one, meet people where they’re at, because we’re all at a different level of understanding of how to do this and how to use the tools.
But then really say, how do you meet people where they’re at and give them a chance to develop that literacy and competency, and, if they want, expertise? And so we have an internal skill development platform, Udemy. It’s a competitor to Coursera, which a lot more people tend to know. And we have basically built programs to help people build data literacy and data competency to be able to “give them the agency,” and then build communities of practice so that those people are talking to each other and learning from each other.
Because if you’re a more sophisticated user of data than I am, it’s valuable for me to interact with you, because you’re at step 10 and I’m at step five, and when we talk, you tell me about what you learned at steps six, seven and eight, which helps me move through those steps more quickly myself. And so, through a combination of educational development, professional development and peer communities of practice, you help the whole organization move to a new level of understanding and then competence.
Dr. Cristi Ford:
I was just reading an Inside Higher Education article that talked about faculty burnout. And as we’re having this conversation around building these capacities for faculty, I imagine you’ve been in conversations where people say, “We don’t have time.” For those who are listening and trying to reach a tipping point in building this capacity, how can they help faculty understand that this will actually save them time in the long run?
Tom Andriola:
Yeah, this is a great question. We’re going through this question as well. We had this very discussion about two weeks ago with our top person for teaching and learning at the institutional level. And this is where, I think, you talk about data as a foundation and then tools on top of data. One type of tool is, let’s just call it, analytics.
The second level that we’re now talking a lot about in every conversation is that two-letter word, AI, which is a combination of some analytics and some automation. And when we talk about this specific issue you’ve raised, faculty burnout, not enough time, this is where the conversation more recently for me has trended towards: why don’t we use AI? Again, it starts with the data plugged into some type of intelligent tool, some people call it an algorithm, so that it takes workload or cognitive load off of those very, let’s say, high-value individuals, to give them time back or give them time to stay in a high mental-load space. Because nothing burns people out more than having a high-level position but ending up spending a disproportionate amount of time on the mundane. And this is where data, pushed into AI tools, can take that mundane work and reduce the amount of time it takes to “complete those tasks.”
And I think as people start to work with some of these tools, and I know this has been my experience, it’s not all just the mundane work; it extends to some of the higher-level thinking work. I recently had to do a market analysis and competitive analysis, and I used some of these tools. And the timeframe that would normally take me, compared to the time it took me with the support of these tools, was, let’s just call it, a 10X reduction. So I think when we talk about faculty burnout, the selling point we’re trying to make is: let’s look at the mundane things that you hate to do, which are more challenging for us to be able to pay TAs to do in this new world, and let’s figure out if we can automate some level of that.
And I like to talk about it this way. It’s like, look, if we can automate this and set parameters around one sigma, around the norm, around the average, then anything that’s within one sigma, just let the machine take care of for us. If it goes outside the one sigma, that’s a trigger that we need a human back in the loop, because it’s outside of accepted tolerances and we don’t trust the machine enough yet to have that type of judgment. And so this is a growing narrative, and I think we’re looking at what are those three to five places where faculty members get the most frustrated with workload, and can we target data plus AI to really reduce the amount of time they spend in that space.
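Andriola’s one-sigma rule can be made concrete. As a hedged sketch (the data, task and threshold below are illustrative, not a description of any UCI system), an automation pipeline could let the machine handle anything within one standard deviation of the historical norm and route everything else back to a person:

```python
import statistics

def triage(history, new_value, sigmas=1.0):
    """Return 'automate' if new_value falls within `sigmas` standard
    deviations of the historical mean, else 'human_review'."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if abs(new_value - mean) <= sigmas * stdev:
        return "automate"       # inside tolerance: let the machine handle it
    return "human_review"       # outside tolerance: human back in the loop

# Hypothetical example: minutes spent on a routine grading task in the past,
# and two new cases to triage.
history = [12, 14, 13, 15, 11, 13, 14, 12]
print(triage(history, 13.5))  # 'automate'
print(triage(history, 25.0))  # 'human_review'
```

The design choice worth noting is that the threshold is a policy knob: widening `sigmas` hands more cases to the machine as trust grows, and tightening it keeps more decisions with people.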
Dr. Cristi Ford:
So I really appreciate that focus there as you’re talking about the faculty side and really thinking about institutional leadership and all the different hats we wear. But we’re talking today about digital literacy, and so I’m wondering, how would that look different for students who are increasingly going to be moving into a workplace and a marketplace saturated with data? How do we start to have that conversation differently?
Tom Andriola:
Yeah. The survey results are starting to come out now as universities are surveying their students about what are their perceptions of AI, where does their brain go to? And I think what we’re seeing is that even in the student population, even though people say it’s like, “Well, they’re adopting this stuff much more quickly than let’s just say those of us who have a few more years under our belt.”
The reality is that their curve looks like a distribution curve as well, right? There are people who have jumped on it very, very quickly. There are people who are still kind of sitting on the fence a little bit: I don’t know about this. Is it safe? Does it protect my privacy? Do I give my data away if I interact with these tools? And then there’s a population that is, let’s just say, disinterested in it. So I think it’s all the same.
What we do know is that AI is going to be an all of us thing. When I say that, what I mean is that, looking at it from a long-term perspective, it’s going to fundamentally reshape our lives and our economies. Think about how the internet has done that for us. AI, intelligence and automation through data, is going to be in the same realm when we look back 10 years from now.
So we need to figure out what that means from the standpoint of our students and the time that we have with them. And that’s everything from exposing them to this in the right way and context so they get a comfort level with it, both in terms of their educational endeavors, but maybe even more importantly, their post-education endeavors as they go out and start careers and independent lives. And so we’ve now talked about this from the standpoint of, well, it’s not just a class. Maybe there is a class, and we’re going to launch our first kind of making-sense-of-AI course that lots of students can take, with lots of content that students can “absorb” and skill-build around through that same Udemy platform that our employees use.
But then also, how do we infuse the right conversations? This is where the topic of responsible, ethical, and equitable access and use of AI is really important because if we don’t take a role in that, they’re going to learn it after they’re through with us. We’ve asked ourselves, “Do we have a responsibility there in terms of our mission to bring these topics into discussion while we can shape the way those discussions happen?”
And I think, for a majority of our population, we come to the answer of yes: we need to put that into the classroom. We need to put it into co-curricular programs. We need to have speaker series and bring people onto campus to talk to our students about these topics as well as… And look at the people we’re hiring today. We expect them to be facile with these types of AI concepts and tools for the jobs we’re hiring for today. And that’s only going to increase going forward. It’s really helpful for our students to hear that from people who are in the next chapter, to say, “Gee, maybe I should start using these things as part of some of the classes I’m working in where my professor says it’s okay to use it.”
Dr. Cristi Ford:
Yeah, I think you’re spot on there, in terms of the ways in which we need to really influence the curriculum, really thinking holistically and comprehensively around this work. Before we jump to hear a little bit more about what you’re doing around AI, I just want to go back to a book that you introduced me to, which I then had a conversation with Michelle Weiss about: The 100-Year Life. For listeners who haven’t heard of this book, what I really appreciate about the premise is that we are moving away from this three-stage life where you learn, you earn and you rest. So if you’re living to 115 or 125 years old, how are you going to financially sustain yourself? And how do we rethink these stages, which have typically been connected to your age or adolescence, and the different facets there?
So as you talk about data and the potential societal and economic implications for institutions like UC Irvine, if we move away from this three-stage approach, what should institutions be doing differently and how should they be thinking about data in a very different way?
Tom Andriola:
Just for context, the book does a really great job of talking about this new way of thinking about life, not in these three very distinct stages, but as something that’s a lot more iterative. For the developers out there, think waterfall to agile as the analogy in our world. You need to think through a lens of: my financial assets need to be thought about a lot differently. My health needs to be thought about a lot differently. My social capital, maintaining relationships, because we know that positive relationships, and maintaining them, especially in your elder years, are really important to longevity.
And then coming back, Cristi, the thing that we’re talking about today. It’s like if we’re not just going to work till 65 and then retire and live out our years sitting in the sun in Florida or California, but we’re going to have this, we’re going to continue to contribute in some work capacity or some mission-based capacity, how do we maintain skillsets so we don’t get passed by what the marketplace is hiring for? We have to think about our skills, knowledge, and competencies differently. So that’s what we’re talking about and that’s how it fits into the context of the model.
Well, okay, now bring that back to an institution like where I’m at. Between the world just moving a lot more quickly and re-skilling happening a lot more over the course of, let’s say, a career, the concept of a degree program becomes somewhat incompatible, especially for working professionals who say, “I’ve been in this field, this field is shrinking and the growth fields are this, this and this. How do I take a combination of the things that transfer and the gaps that I now have to go fill? How can I do that quickly?” I can’t take three years to do that. I need three months to get to a certain level of confidence so I can effectively go interview for that job.
And so I think this is forcing universities to come back and think about lifelong learning concepts. How does lifelong learning fit into the traditional model, or complement it, or sit adjacent to the traditional model, right? One university I was with very recently, I think it was at one of your events, talked about Columbia Plus, kind of building on the concept of Disney Plus: a lifelong-learning subscription that comes along with going to Columbia for a bachelor’s or professional program. And I love that even that thinking is outside the box for traditional higher education and much more compatible with what I think the world of the next 50 years is going to look like.
And so if you’re a traditional twenty-year-old, this is going to be the normal world for you. For me, I am adapting from the way the world has been to the way the world will be, and I’m having to make that pivot and inflection mid-career. But for our traditional students, or even non-traditional students who might be in that 25 to 35 range, this is going to be a majority of their working career, where they’re going to be thinking about re-skilling opportunities and the trade-offs of time and money to be able to get to that next job at that income level. And so universities have to figure out how they respond to that market need, because that’s going to be the prevailing need going forward, not the four years of college with maybe some percentage coming back to get a master’s degree. And we’re already seeing that in our conversations, and I’m sure most of your audience is as well.
Dr. Cristi Ford:
Yeah, completely agree with you. And I even think about IPEDS data, or any of the data the IRB or institutional research groups collect; we’re going to think about those challenges differently. It’s going to be less about a first-year student, less about classifications, and more about how we’re thinking about other kinds of data we can use to quantify those students who are upskilling and will continue to come back to us over and over again. So I love that model, that Disney Plus effect, in terms of really thinking about how we extend the partnership between institutions and individuals who will come back to us time and time again as they pivot to take on different careers.
Tom Andriola:
And then coming back to something we started with, which is that all of this is going to be captured in data. Most of the states in the union right now have some type of cradle-to-career data set that they’re building to look at that longitudinal journey, to be able to figure out what’s working that we should double down on, where it’s not working, where our biggest gaps are, and where we need to invest in the next program. So data is informing how we need to change for the future. Some people call that real-world evidence. Well, the real-world evidence game is coming into education and career development in a big, big way, and we’re just going to see more of that.
Dr. Cristi Ford:
Well, leaps and bounds. I want to turn back to the conversation we were having earlier about AI. As the vice chancellor of technology and data, I understand AI falls under your Office of Information Technology, so I’d love to hear a little bit more about what your office is implementing and why. But first, I think I’ve heard you say that AI is an all of us thing. Can we just start there, and you share with the audience what you mean by that?
Tom Andriola:
Yeah. When I say it’s an all of us thing, think of the internet, right? It’s like the internet came, there was a set of early adopters who did things with it that are amazing. Some of those have become dominant companies, like Amazon is a great example of an early adopter into the internet as a platform to reach people, to give people what they were looking for in a different way.
And then we spent literally two decades trying to talk about something we call the digital divide, right? The haves and the have-nots, and we’ve made tremendous progress. Maybe we haven’t made a hundred percent, but tremendous progress in closing the digital divide. Why? Because it’s an all of us thing. We all need it. We don’t ask people whether they know how to use the internet in a job interview.
Dr. Cristi Ford:
Not any more.
Tom Andriola:
Hell no. No one answers the question with, “Well, I’ve never actually used it, but I have some friends who’ve used Google. Is it something that’s required for the job?” I mean, that’s a ridiculous question, but I’m old enough to remember when that was not a ridiculous question. So I put AI in that category. I say it’s all of us. Everyone needs to be exposed to these tools, which means everyone needs to have access to these tools, and that comes back to something important about our strategy at UCI. And then we need to invest in helping people understand how to leverage these tools for the needs that they have. And that might be to educate themselves, or to get a better job, or to navigate where they’re going on the next vacation. It doesn’t matter, but everyone should know how to use these tools, like everyone knows how to use the internet.
And then if we really want to create the more equitable world that we all want to live in, we need to take on topics like responsible, ethical and equitable use in the conversation, and teach those as core values around this new capability that everyone has access to. So that’s what I mean when I say it’s an all of us thing: this is going to touch every one of our lives. It’s going to touch every aspect of our lives. We need to make sure people have access to it and that they use it in the appropriate ways, in the ways that we want them to use it. I mean, think about the conversation we’re already having around misinformation and AI, and how it might affect things like elections. This is an example of responsible, ethical use that has to be brought into conversations if we want people to trend towards positive uses of it versus manipulative uses of it.
Dr. Cristi Ford:
As I’m listening to you talk about the ethical considerations and the like, and thinking about equity: as I chat with individuals, we all know it’s important, but what I’m finding is there’s this chasm between understanding in theory and principle that these things are critical, and then actually creating a roadmap to close that chasm, to understand where we stand morally and what we need to be doing in our institutions. I don’t know how you all are working through some of those challenges and thinking through those things, but I know those are the conversations I’m having with institutions on a regular basis as we work through that.
Tom Andriola:
Yeah, they are. And usually when I’m with other institutions, they ask this: how are you thinking about this? Right? So some of the words that come up in that conversation: have you enacted policies? Are you going more the route of guidelines? How much of it is institutional versus putting the responsibility for defining what’s allowable and not allowable with the instructor?
So those are some of the conversations; I do a lot of comparing notes. It’s really early, there’s no kind of right answer, and people make choices. And then the really interesting thing is to talk about how well it has worked, right? I mean, how well has it worked in terms of the intended behaviors you wanted to come out of it? Have there been some unintended consequences? And if you had to do it over again, would you choose the same course or would you go a different way?
And I think we all need to be learning from each other. At least that’s how I’m thinking about it: we need to be learning from each other. And that’s why I usually say yes to panels, because, if nothing else, on the panel I get to hear some other perspectives on how they’re trying to do this. And I find that very interesting and important.
Dr. Cristi Ford:
Really good, really good to hear that through lifting all sails and having those conversations, there’s an intellectual opportunity for all of us to rise around this conversation. But I want to turn to talk about UCI a bit and something I understand is called ZotGPT?
Tom Andriola:
That’s right.
Dr. Cristi Ford:
Can you tell me a little bit about what it is, where it came from and why it’s important for your faculty, staff, and students?
Tom Andriola:
Sure. So first of all, you got to answer the question about the Zot. People are like, “What’s up with that?”
Dr. Cristi Ford:
What is Zot?
Tom Andriola:
So our mascot is the anteater. It’s the only anteater mascot in the United States. They’re actually very ferocious creatures; you do not want to take on an anteater. But the Zot, and this really goes back to people who remember the Saturday paper cartoons, the Zot goes back to the anteater in the old comic strip B.C., and that was the sound that character made.
And so, in its wisdom, the university back in the late sixties adopted the anteater, and this sound, Zot, Zot, Zot, as our kind of chant. So many things at UCI are just Zot-something. So when GPT came out, we had to go with ZotGPT, because of course we couldn’t be any more creative than that. So this is our version, and here’s how it came to pass. These tools became available, the early adopters were there, they were using them, and then when the better versions came, those people whipped out the credit card or the P-card and said, “$20 a month, I’m going to pay for this.”
We were really watching and talking to the early adopters and really learning what are you doing with it? I really use that as the first peer community of practice, but then in my role, I had to run around and say, “Okay, for the people who haven’t started engaging with these tools yet, why?” If my job is to make it an all of us thing, then my job is to get 100% of faculty, students and staff at UCI working with these tools. So why aren’t you using it?
And there were two primary answers that came up most frequently. One: I hear it’s not secure. Two: I hear it’s not privacy-respecting, meaning if I put my data in here, I’m giving my data to Google, I’m giving my data to OpenAI, and I’m not cool with that.
So iteration one of ZotGPT was just to take the commercial tools and, the simple analogy I use, put a fence around your yard. Anything you do in your yard is protected. Nobody’s coming into your yard, the ball’s not getting outside the yard. It’s just your protected territory where you can play and experiment and get comfortable with these tools. And we basically invited people in, saying, “Look, the two biggest concerns you have, the reasons you haven’t engaged? We just took them off the table for you.” So then we saw that next big bump of adoption and people going, “This is actually pretty cool. Guess what I just got it to do.”
Now we’re still not at a hundred percent, but that was version one. We also listened to what those people were telling us, and that has informed the development activity that’s been going on over the last five or six months, and we’ll roll out version two, which will have some advanced capabilities, this fall and through the rest of the year. Now, there’s an interesting side story here, which is that when we wanted to roll this out in January of this year, the faculty was like, “We don’t want to give this to students. We don’t think we’re ready for that.” And you have to pick your battles. So I’m like, “Okay, cool. It’ll just be faculty and staff.”
Dr. Cristi Ford:
This year?
Tom Andriola:
This year, I’m talking January 2024, is when we flipped it on for the institution. But we came back in March and we basically said, “Hey, we have an equity argument to make here. We know our students are using these tools. We know the students with more means are paying for better tools than others. If we believe in equity, we need to level this playing field.” And so we proposed turning ZotGPT on for our students, so everyone has access to the best models, which right now for OpenAI is ChatGPT 4.0. And the faculty didn’t really have any concerns; they didn’t even argue much with us on that part. It was like, “No, you’re right. Equity is a really important value for us.” And so our students now have access to these same tools, and that actually got more publicity than we thought it was going to, right?
I guess a lot of institutions hadn’t taken that approach, and so we’ve gotten quite a number of questions and inquiries to explain it. What we’re able to do now, though, is that with lots of students playing with it, they’re bringing back ideas and they’re building, I just call them plugins, specific use-case capabilities built on top of ZotGPT, some of which we’ll roll out to help students and the student experience this fall. So the tool gave students agency, their creativity gave them ideas, and the platform gave them the ability to bring their ideas to life for themselves and their colleagues. And that’s what this world is going to be about. If you think about how the internet evolved, it was similar: people took the internet and they built websites, and then applications, and then capabilities and experiences on top of it. And we’re really trying to prepare ourselves to support that same journey for all of us.
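The episode doesn’t go into ZotGPT’s internals, but the “fence around your yard” description maps onto a familiar pattern: an institutional gateway that sits between users and commercial models, so prompts stay inside a boundary the campus controls. Purely as an illustrative sketch, not UCI’s actual implementation, the gateway URL, environment variable and course-assistant prompt below are hypothetical; it shows how an OpenAI-compatible client can simply be pointed at such a gateway:

```python
import os
from openai import OpenAI

# Point a standard OpenAI-compatible client at an institution-run gateway
# instead of the public endpoint. The URL and key variable are hypothetical.
client = OpenAI(
    base_url="https://ai-gateway.example.edu/v1",
    api_key=os.environ["CAMPUS_AI_KEY"],
)

response = client.chat.completions.create(
    model="gpt-4o",  # whichever models the institution has licensed
    messages=[
        {"role": "system", "content": "You are a study assistant for an intro engineering course."},
        {"role": "user", "content": "Walk me through Bernoulli's equation step by step."},
    ],
)
print(response.choices[0].message.content)
```

Because every request flows through the gateway, the institution can enforce authentication, apply the data-retention terms it has negotiated with vendors, and swap or add models without users changing their workflow, which is also what makes the student-built plugins Andriola mentions possible.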
Dr. Cristi Ford:
So for those who are listening, when I made the comment about conversations around equity in theory versus practice, colleagues, this is what it looks like in practice. Kudos to you all for making this available to all students, so that you’re reducing that barrier, that cost to serve, for students to have access to a technology. I think these are the kinds of conversations that more institutions are grappling with. Many haven’t yet gotten to a point where they have a budget line to support it sustainably as a campus-wide effort, but it sounds like UC Irvine thought about this and said, “This is an equity issue and we want this to be available for all students.” So just a great call-out and shout-out to you.
Let’s move on a little bit. I think you talked with some of our producers and said, “With great power comes great responsibility,” and that there were three words you adhere to when you talk about students and AI use. Can you share a little bit with our audience about that?
Tom Andriola:
Without mentioning Spider-Man?
Dr. Cristi Ford:
We can mention [inaudible 00:31:41].
Tom Andriola:
We all know that’s the reference to those great words. Yeah, well, I think anyone who’s spending time with these tools and really investing in them starts to understand the power that’s there. I did something recently where I basically asked one of these tools to act like a Goldman Sachs analyst to help me study a particular business segment, et cetera. And it did a wonderful job, with knowledge I don’t have, expertise I don’t have, expertise I would’ve paid a significant amount of money to get access to, and it did it in a fraction of the time.
And that’s just one situation that I’ll share with you. Any one of us could take anything we’re trying to juggle right now, or figure out a plan for, and use one of these tools. You can see, once you learn the power of prompting, how much you can get it to do, how robust a perspective it can give you in terms of understanding pros and cons, different perspectives and stakeholder views. So what I’m saying is that these tools have tremendous capability, which means that if someone grabs a tool for good, they can do tremendous things. If someone wants to use it for nefarious activities, they can do tremendous damage.
And you’re going to see even more content coming out about how this feeds things like the misinformation engine. And it’s not just about misinformation, but about how it can take ground truths away from people, where they start to not even think about ground truths anymore and just react to whatever hits their field of view. So this is where it’s like, how do you prepare? If you think about our students, how do you prepare them to understand that this is the world they’re going to live in for many, many years to come, and it’s only going to get more complex than it is today? Five years from now, it’s going to be even more complex.
So it really comes down to that phrase: great responsibility. We take seriously the time we have with our students, in whatever capacity it might be. We have a great responsibility to take what we’ve learned about these tools, but also what we’ve learned from analogous situations. I use a lot of analogs from the evolution of the internet here because I see tremendous parallels, not complete parallels, but a lot of parallels that you can draw. And part of my responsibility is to tell those stories and connect them to today’s reality, because that’s what doing right by our students will be. It’s not just giving them an education, it’s about really preparing them for whatever they decide to be next. And that’s what I mean when I talk about great responsibility.
Dr. Cristi Ford:
Really, really good. I’ve been looking forward to this conversation so much, and as a fellow podcast host, I’d love to end with you telling us a little bit about Digital Squared, what it’s all about, and what listeners can expect when they tune in.
Tom Andriola:
So, tying back to something we talked about in the beginning: it’s Digital Squared, life in an increasingly digital world. It’s really trying to drive home for people, through speakers and examples, that your entire existence is now being infused with more technology, which means more data is coming out, and things you used to do in analog, with your eyes, your ears, your thoughts, are now complemented or maybe even replaced by computable data, computation and intelligent software doing those things for you.
Again, I go back to the maps example. That has completely been taken over by digital, and we trust it to get us from point A to point B without question. And so I’m trying to give people examples from different industry leaders and thought leaders about how that works. Cristi, one of the things that’s interesting is, in this world of constant whitewater, I’m wondering whether my topic is becoming a little bit passé, because as we move from the digital era of technology and data to the AI era, where we’re talking about intelligence and automation, I’ve been thinking about whether I need to pivot a little bit on the types of speakers and topics that we address, because the data exists.
We’re not talking about is there data or where might there be data? The data’s there, it’s now what are we doing with the data that has become kind of top of mind for all of us? And of course, we’re trying to use it to be more intelligent in the way that we make decisions, doing them in faster and faster cycles. And then one of the big things is, “Hey, are we going to turn over some of this decision-making to an autonomous machine?” And so I’m thinking about, “Boy, maybe I need to pivot with the podcast.” We can still call it Digital Squared, but maybe we might need a new tagline because I feel like we are evolving into really a new set of challenges and conversations.
Dr. Cristi Ford:
Tom, I love that you’re upskilling the podcast. I mean, we all have to sometimes make that pivot and shift that conversation because to your point, AI is here to stay. Digital fluency and the things around data are here to stay. And so how can you really lift all sails and build capacity around it?
Tom Andriola:
And I had a mentor earlier in my career, share with me a quote that I still think about and I share. It’s like, “Look, the future’s already here. It’s just unevenly distributed.”
Now, when that was shared with me, I was in the for-profit sector, so I was about market share and revenues and profits. But when I came into public sector service and you take the second part of that, “It’s unevenly distributed,” it takes on a different meaning. It takes on the meaning of that concept of it being about all of us. And one of the things I know, because I do work in health and equity, and I’ve done work on the digital divide since I’ve been in the public sector: if you bring it back to statistics, inequities happen when the distribution curve becomes too wide and not confined around the mean.
And so one of the things I see happening right now with topics like AI is that the distribution is really wide. You have people who are really, really advanced, and people who are just not even paying attention to what’s going on, and that’s when you get significant inequity. It’s also when we start to talk about things like social unrest, because with inequity comes unhappiness, which becomes anger and a response.
And so one of the things I hope, in my very small way, that I can contribute through the podcast is to bring more information, so that people can say, “I see that this is the world, and I’ve got to move more towards the mean,” and hopefully we can keep that distribution curve a little bit tighter.
Dr. Cristi Ford:
Tom, thanks again for being here. And if you want to learn more about data and the implications of living in an increasingly digital world, check out Tom Andriola’s podcast, Digital Squared. We also want to thank our dedicated listeners, the curious educators everywhere. Remember to follow us on social media. You can find us on X, Instagram, LinkedIn or Facebook @D2L, and subscribe to the D2L YouTube channel.
You can also sign up for the Teaching and Learning Studio email list, where you’ll see the latest updates on new episodes, articles and master classes. And if you like what you heard, remember to give us a rating, share this episode, and subscribe so you never miss out.
You’ve been listening to Teach and Learn, a podcast for curious educators brought to you by D2L.
Dr. Emma Zone:
To learn more about our K through 20 and corporate solutions, visit D2L.com. Visit the Teaching and Learning studio for more material for educators by educators, including master classes, articles, and interviews. And remember to hit that subscribe button and please take a moment to rate, review and share the podcast.
Dr. Cristi Ford:
Thanks for joining us. Until next time, school’s out.
Speakers
Dr. Cristi Ford
Vice President, Academic Affairs
Dr. Cristi Ford serves as the Vice President of Academic Affairs at D2L. She brings more than 20 years of cumulative experience in higher education, secondary education, project management, program evaluation, training and student services to her role. In this role, she offers thought leadership and direction to the academic affairs unit of the organization. Her previous roles have allowed her to have an impact on education in secondary and higher education settings within North America and as part of the international landscape. Her reach has allowed her to focus on building online education in the US and in Africa.
In addition to her experience building new online learning programs and research related to online teaching and learning, Dr. Ford possesses significant experience in the design and delivery of integrated educational support, training and transition services for young adults and children with neurodevelopmental disabilities.
Dr. Ford was selected by the Online Learning Consortium as the 2022 OLC Fellow (the highest professional distinction offered by the association). She is a tireless advocate for quality online education and has leveraged her passion and expertise in many realms in the education space. She is known for utilizing her leadership in extraordinary ways to help institutions build capacity to launch and expand online programming through effective faculty development, instructional design and pedagogical practices.
Dr. Ford holds a PhD in Educational Leadership from the University of Missouri-Columbia and undergraduate and graduate degrees in the field of Psychology from Hampton University and University of Baltimore, respectively.
Tom Andriola
Vice Chancellor of Technology and Data and Chief Digital Officer at UC Irvine
Tom Andriola is UC Irvine’s VC for Information Technology and Data and Chief Digital Officer. His role is designed to ensure the strategic use of data and technology, drive interdisciplinary partnerships, and champion digital strategies.
Andriola is a global business & technology leader with a broad array of experience in the public and private sector. He is an advocate for progress and equity utilizing technology and data, serving on the boards of OCHIN and Unizin.
He holds a bachelor’s degree from George Washington University, a master’s degree from the University of South Florida, and has completed the Stanford Executive MBA program.