RESET with Tonya
Ready to thrive in a world of unprecedented change? Each week, RESET brings you conversations that matter with visionaries, innovators, and bold reinventors who are redefining what's possible in work and life.
We're tackling the big shifts in work, technology, longevity, and purpose – not just with theory, but with battle-tested strategies and authentic stories. Whether you're navigating career transitions, embracing new technologies, or seeking deeper meaning, RESET delivers the roadmap and community you need to transform challenges into opportunities.
Resetting Tomorrow: Dialogue at Scale
Imagine if every person who used an AI also helped design the next version of it. That’s the future MG Alcock is building toward—one conversation at a time. We sit down with the technologist, cultural strategist, and writer to unpack “perspective economics,” the idea that human experience is the scarce asset modern systems need to learn safely and well. From his early days shaping Disney’s internet era to crafting future‑focused narratives for Microsoft’s global CIO summits, MG shows how real change depends on dialogue that people actually trust.
We get practical fast: why the standard pipeline (collect, clean, train, run, repeat) fails at the edge of human systems; how to create the perception of dialogue so high‑stakes rooms stop posturing and start listening; and what happens when companies over‑automate and later rehire for perspective. MG shares field stories that stick—near‑dark factories that still needed five people to make sense of anomalies, a services firm that held at 100 humans because below that, the work lost meaning and resilience. The throughline is clear: perspective isn’t ornamental; it’s operational.
MG also reveals the world of his sci‑fi novel, Ogun Shogun, where an AI named Big Al (look closely at the letters) enacts a “Compact of Perspectives v3” and a free market for perspectives drives better systems. It’s a story designed to seed policy, normalize new roles, and help a wider audience tolerate the future without waiting for crisis. We talk trust, governance, and the hard parts—misinformation, low‑quality signals, polarized incentives—and why learning cultures must feel safe before they can be curious.
If you care about AI alignment, organizational change, storytelling that moves policy, or simply how to stay human as tools get smarter, this conversation is your map and your wake‑up call. Subscribe, share with a friend who leads through change, and leave a review with the one perspective you think today’s AI most needs to learn.
CONNECT WITH MICHAEL 📖
- LinkedIn: https://www.linkedin.com/in/micalco/
CONNECT WITH RESET 🎙️
- Podcast: https://www.reset-podcast.com
- YouTube: https://www.youtube.com/@tonyajlong-RESET
- LinkedIn: https://www.linkedin.com/company/reset-with-tonya
- Instagram: https://www.instagram.com/resetwithTonya
- Facebook: https://www.facebook.com/profile.php?id=61570923056203
CONNECT WITH TONYA 🚊
- LinkedIn: https://www.linkedin.com/in/tonyajlong/
- Instagram: https://www.instagram.com/tonyajlong
- Facebook: https://www.facebook.com/tonya.j.long/
- Check out my bestselling book, "AI and the New Oz: Leadership’s Journey to the Future of Work" available on Amazon [https://a.co/d/aTBJmEr]. Go to the "AI and the New Oz" website at https://www.ai-and-the-new-oz.com/ to learn more!
#thejourneyisthejob
Hello, everyone, and welcome to today's RESET with Tonya. We're doing a remote edition because I'm going to be with Michael MG Alcock on Thursday at the Frontier Tech Forum. Michael, or MG, maybe I should say, is an amazing technologist, but he's also a cultural strategist. And near and dear to my heart, he's a writer. And I can't wait to talk about all three of those things and how they create this fascinating narrative around the lifetime of storytelling that he has produced. Michael knows so much. And I think with that wisdom, he sees so much into the future and where we're headed. So much so that he is writing a trilogy of novels about how he sees the future, from a fiction perspective, in order to help us get more comfortable with where we're headed, so that the technology he's designing will have a place to land and people will understand it and be able to be effective with it. And I think that's just a remarkable service of purpose for what he's building. So MG, Michael, welcome. We're so thrilled to have you on RESET with Tonya here on KPCR 92.9 FM. Welcome to the Bay Area.
Michael Alcock:Oh yeah, thank you. Yeah, down from Seattle, you know. I'm glad to be down here.
Tonya J. Long:Excellent. Excellent. So tell us, right? You're a writer, you're a futurist, and previously, I forgot to mention, you basically birthed Disney's internet era. Well, other people might take credit for that, but you know, you were around at Disney when the internet really took off, much like AI is taking off now. So with that as context, the history of the amazingly impactful and influential things that you've done, what are you working on now that's giving you joy?
Michael Alcock:Oh well, I mean, before the writing came the idea. And the idea was born from the experience of going through all of those things that you just talked about. The internet changed everything. And then we had AI and we had genetic manipulation and we had the age of abundance, and all these concepts didn't exist when I started the journey, but they became clearer and clearer and clearer with each step that we took. And so I became obsessed with this idea of, when you can see patterns, when you can see how the future is gonna come, or, you know, some of its destinations. You don't know every detail, but you know the destination. You become fascinated with the idea of lessening the pain or making it a better future, or, you know, helping people see the thing that they will wish they had seen earlier. Does that make sense? It does, it does. Yeah, and so that became the fascination. And for me, that means how to make dialogue between humans and AI productive. That's really the nut of what I became obsessed with about 15 years ago and have been pursuing ever since.
Tonya J. Long:I don't want to downplay it, but I'm in these conversations every day about change, about leading change. And you see the change, you've described some of the challenges that we will see in living through this era of tremendous transformation, and you're doing something about it. The book isn't the thing. The book is just, I don't want to call it a marketing piece because it's not for marketing, it's for comprehension and understanding, to help people come to terms with what it might look like.
Michael Alcock:Yeah, no, I mean there are a lot of people who can see the difficulties of a world where more and more jobs get replaced by AI. Yes. There are a lot of people starting to feel really worried about it. And 15 years ago, I started to say, well, wait a minute, can I get ahead of the problem? Right? What is required to get ahead of the problem? It's not enough to just simply say AI is bad or AI is good or there'll be more jobs that'll magically appear. I needed to think it through in a detailed way. And when I did that, I came to a relatively simple conclusion, which was an algorithm, because honestly, we don't have real AI yet, we have algorithms, right? But the idea of AI is built on our experience with these algorithms today, and our experience tells us that we can build very sophisticated algorithms that can do some amazing interactions with us and can produce art and some writing and business plans and analysis and summarization and all sorts of things, and so that's our opportunity to feel panic, right? Just like the first time a computer beat a guy in chess, right? It's the same thing. And so, getting ahead of that, I thought, well, what is that algorithm really? What is the next generation of AI? And the more I thought about it, the more it became clear to me that the current process of building an algorithm, which is gather data, massage data, train a model, run the model, repeat, that's the whole universe right now. That's what we do with everything that we're doing. That won't scale. Because the closer you get to human systems, the more the humans are involved. So even if you get rid of all the jobs, those jobs produce things for humans.
Tonya J. Long:For humans, absolutely.
Michael Alcock:And so if humans are consuming it, then the train-the-model, run-the-model loop is gonna have to become more dynamic. And I thought about that and I said, well, if it has to become more dynamic, we're gonna need a feedback loop. And it's not good enough to have three experts in a room. We need a gigantic feedback loop. We need a feedback loop at scale. Wait a minute, it's not good enough to be a feedback loop. No, it's not like a survey, no, it's a dialogue. Oh my gosh, this has to be a dialogue at scale. Every human being who's consuming the service is also a part of designing the next generation of the service with their perspectives, and they're gonna be self-aware of that, and so we're gonna have to design a system that can handle that. And once I realized that, that's all I worked on for the next 15 years.
Tonya J. Long:I like that. So, early in my tech career, I worked for a company where what our competition liked to say about us was that we were the PhDs designing for the PhDs. Now, I don't have a PhD. I stopped at a master's degree, but, you know, it was kind of true. A really smart, brilliant group of people. And I think lots of times we're in an AI bubble, people like you and me. This is what we eat, breathe, and live. I was just on a call earlier today with a friend of mine, a professional friend, and she was talking about visiting family down in Southern California over the weekend. There were 30-year-olds with PhDs in sciences who had never bothered to use AI. It's not that they're living under a rock, they're just, I would say, indifferent to it. Yeah. And I like to say on stage that the thing that keeps me awake at night is my family in Tennessee, because similarly, they're indifferent to all the things that we see on the horizon, the things we are so excited about, and a little fearful, because we want to make it right for everybody. I deeply sense that from you, that responsibility to help people all make this journey successfully. So the question I have then is, how do you explain what you're working on to someone outside of tech, not involved in publishing?
Michael Alcock:Yeah, I mean, I practice this all the time, have for the last 15 years. No matter who I meet, I try to see what their native state of preparedness is, right? Because I need a clear picture of that in my own mind. And so I'll be riding in an Uber and I'll lean forward and say, hey, do you worry about driving jobs in the future? Or, do you have children?
Tonya J. Long:You're that guy. Yeah, and I say, Uh, do you have children? What do they want to study? Yeah, yeah.
Michael Alcock:It doesn't matter what they say in response. What matters to me is the question of, do you worry whether or not that profession will even exist? And invariably I get the answer yes. And even if they're indifferent, even if they think it's all a bunch of hogwash, the worry is real. And so I use that to begin a conversation. But I will agree with you straight off the bat, I have PhD friends who completely ignore AI, or they'll say something cute, like, you know, oh, I have a friend who's making money because he generates 150 videos a minute and pushes them out and makes a profit on ad revenue, blah, blah, blah. Isn't that amazing, Michael? And I go, no, that is not amazing. That is the sign of an impending bubble crash. It's a cheap response. Exactly. And so I have to stop myself from doing that, and I have to say, yeah, there are a lot of people doing that, yes, but there's a next step, right? And I try to get them interested in that, and that's what led me to writing a novel, because I realized that I wasn't the only person running around doing this. There are a lot of us out there, all these people who care about the future and can see the future and are not as focused on the next dollar, right? Yep. And even though there are a lot of us, it's very hard for us to connect. And so we need to kind of put a pin in it. And the only way to do that is with a lot of eyeballs, with a lot of attention. And you already see some people doing that. Deepak Chopra, there are a lot of people all trying to put a pin in it. And unfortunately, I haven't seen any that have the depth of thinking about, you know, this concept of perspective, you know, being shared to drive an algorithm. So I'm just trying to, you know, get that story out.
Tonya J. Long:Yeah. I think being a good storyteller requires being able to put yourself in a lot of different chairs to understand people. And one of the things that we talked about before now was how much you love pop culture. And so I think, you know, you are super, super smart, but that pop culture attraction that you have, I think it gives you more breadth in terms of understanding what people are interested in, how they spend their precious free time. And that sets up for you a worldview, maybe, that helps you be more approachable and create a product. The book's a product, and you also have a platform you're building that is reachable to those audiences. Because I've said since the beginning, AI won't win unless we are all winning with it. Right? It's not just a cloud toy. So what from your history and your background, pop culture, technology, how did those things shape how you operate, how you show up with these big design ideas and have to convey those to people who haven't had a chance to think about that yet?
Michael Alcock:Well, I should start with a little bit of humility first and just say, you know, that's hard to do, right? To really get people to understand a concept that is orthogonal to the way people are thinking today. You have to acknowledge the sheer audacity of that problem.
Tonya J. Long:Okay, fair.
Michael Alcock:So that when you tell your stories, they come from a place of, we're all doing this together, or this is the future, right? But I know you see things differently right now. If you don't do that, then you can only rely on the traditional agents of change, which are a crisis or inevitability. Those are just the two things that are gonna happen no matter what. I take advantage of any crisis I can, and I take comfort in the fact that inevitability means eventually, someday, I'll know if I was right or wrong. But that's all the comfort that you get, right? So my pop culture interest came from this idea that when humanity got out of just the villages and became global and built all of these amazing technologies, these mediums, telephone, telegraph, television, right? A lot of "tele"s. But anyway, the idea was we were still a community, we were still humanity. And I was blown away by that concept. I couldn't believe it. I just thought, yeah, this can only get better. Then I worked at Disney, and then I was a part of the internet, and now I realize, oh, it's not all just easy panacea. There are real problems, there are trade-offs that we made when we invented these mediums that we weren't even aware of. And that's when it shifted from, oh, I love Star Wars and I love pop culture, and I love the idea of how many scientists chose to be scientists because of what they read and experienced as kids. I shifted from that fascination to, oh, we really need to tap into this. I have to become a professional at tapping into this. And that's what changed for me, really. But before that, when I would manage teams, I would have them all quote Spider-Man. I'm like, great power, great responsibility. You know, I'd lean into all my memes and my pop culture references. And I worked in Hollywood and I did movies and I worked at a newspaper, and they called me Scoop. And you know, I did all of that before my technology career. And I think that's where the roots lie.
Tonya J. Long:Yeah. Well, I admire, I don't know if that's the right word, but I admire that you were in the early days of, I'm gonna call it, digital Disney. Because I think about taking something that is an institution across generations and started out with hand-drawn cartoons, right? And as you came in, what a pivot it must have been for Disney. Not just the digital that would have happened before the internet, but the internet transition, for what it did for the distribution of their products, shifted a company that had a deep, deep, deep history in hand-drawn Mickey Mouse cartoons. What was it like for you? And more than what was it like, what do you bring with you now that you learned through that transition?
Michael Alcock:Well, Disney is a Fortune 500 company that made a huge bet on the internet in the, you know, mid-90s, right? I mean, that was pretty early. And, you know, that came straight from the top. And I was lucky enough to be a young person who had just finished up his tour at Hallmark Entertainment doing TV, movies, and mini-series, and I told my bosses I wanted to do internet, and they thought I was an idiot, and nonetheless they helped me, and I eventually got a job there. But you know, it was the Wild West. It was crazy. We were playing with a brand that has such a cachet that anything we did would be seen by 250,000 people the next day. It just blew my mind that we had really two years to play with on that scale. And just any idea we came up with, we could implement. And yes, we were trying to find ways to be profitable, and, you know, two and a half years later, most of us left because we'd figured out what that version of the internet was about, and, you know, the dream was over. But it attracted some of the most amazing people, and it created some amazing lessons that I'll never forget. And I also know how rare it was to get that kind of opportunity. I mean, I like to joke about it and say what I learned from Disney.com is how to bear witness to the wasting of a billion dollars. But that's giving it short shrift. There really was an amazing amount of learning and crafting of what we call the internet today.
Tonya J. Long:Yeah, yeah. And I'm gonna imagine that at Disney you were younger, right? Because this was, you know, back in my day it was...
Michael Alcock:Yeah, I was about 10.
Tonya J. Long:You're about 10, of course, of course. And being younger, you were a doer. Then you transitioned across the years to Microsoft, where, I didn't do this justice in your intro, but you led the CIO forums that Microsoft produces as a retention agent for its very large-scale enterprise customers. So you moved, and you didn't do this overnight, but you moved from doing to incredible influencing at Microsoft.
Michael Alcock:Yeah, and just to clarify, I was the leader of the content for the global CIO summits. I would never want to discredit my partners in crime. I love it.
Tonya J. Long:I didn't have that clear.
Michael Alcock:Yeah, no, and I want to be clear, because running a summit like that is a masterful operation, and something that thankfully I didn't have to focus on, so that I could focus exclusively on the content.
Tonya J. Long:But that puts a finer point on it, because you were driving content. And I know from my own work in content through multiple corporations, oftentimes we are defining the strategy, then handing the script off to someone who says, that's what I've been thinking, right? And so those experiences influenced, not Microsoft's coming of age, but definitely their evolution into hyperdrive. You weren't just following stories about what happened, you were designing stories about what will come.
Michael Alcock:Yeah, that's an excellent question, actually, because you started with the Disney part. I was young and I was smart and I wanted to make an impact, and I could see what was important and I would try to make it happen. Yep. And I angered a lot of people. Yeah. And I learned how organizations naturally defend themselves against people who act the way I behaved. And it didn't end well for me, you know, in a lot of situations. I'm a slow learner in that regard because I was so passionate, but eventually, by the time I got to that job at Microsoft, I had learned about orchestration, about paying attention to all the voices, and, even more importantly, that something I've identified as wrong came from something right originally. And if I couldn't speak about that, I wasn't even worthy to be at the table. Because it doesn't matter how much you can see the future, if you can't lead people there, if you can't influence, you're not doing anybody credit. I mean, it's fun to, you know, die a martyr and fall on your sword. We all know how joyous that is. But at some point you have to grow up and focus on that. And so that job at Microsoft was fantastic. I got to listen to, I got to speak to, you know, six to twelve global CIOs from the Fortune 500 and Forbes 1000 every month. I also got to listen to what every major senior vice president of Microsoft's major divisions was trying to accomplish. And I got to craft that into a narrative that was future focused. Ah, what a job, right? It was amazing.
Tonya J. Long:It was. And I'm sure as you age, there are gonna be moments in time where something happens and you say, I was at the start of that. I advocated for that thing to have top billing in a speech, and so I helped bring the attention to that, and now here we are, right? Yeah. We are storytellers. You and I both have that common thread, that we are storytellers. I would even go so far as to say we are professional storytellers. Inherent in that is telling other people's stories, not just our own. So I know that you say you were dominant, or whatever words you used, that you ruffled some feathers early in your career. But I think you learned from those instances, because you wouldn't be at the level of storytelling that you're at now if you didn't learn that you have to pull from many pieces outside of yourself to build the story. Yeah, you're right.
Michael Alcock:I mean, I won't say that it wasn't difficult, because it hurts to be rejected so thoroughly. But I had to go through that.
Tonya J. Long:Delivering uncomfortable news is a strength that we both have as change leaders. And I think what I know now is people weren't supposed to like it. I have an obligation to the people I serve to try to make it as comfortable and acceptable as possible, but change is not comfortable. Yeah, and so no matter how much Southern I wrap around change, it is still gonna make people uncomfortable to move their cheese, to use the old phrase. Yeah, right?
Michael Alcock:Yeah, it's absolutely painful for the people who have it as an innate part of their personality. Their safety. Yeah, there have been many days where I wished I could just show up for work, be liked, do a good job, you know, and not inevitably become entranced by this mission, right? Like, I've tried. Right. And I actually would like to meet somebody someday who came from the opposite side, who was doing that and yet found that they had the mission to lead change. I wonder how their journey would be different from mine, because mine was, I'm gonna lead change no matter what, right? And that's a journey in and of itself, and we are a people, right? But, you know, there's a good reason why we aren't the majority. I mean, if you think of us as evolving for social purposes, right? There's a reason why change agents are a lesser percentage of the population. Oh, we wouldn't be able to function otherwise. Yeah, but thank goodness.
Tonya J. Long:It's good that it's a fraction of us that actually are set forward to do these kinds of things. Because you're right, if the whole world ran around trying to create change, we'd have so much disruption we would never get to the execution.
Michael Alcock:Yeah. So I respect that the majority don't need the change most of the time. But those of us who exist in this space, yeah, you know, when it becomes mission critical, it just lights a fire under us, and we have to build that skill set.
Tonya J. Long:When we talked, you you had a term, oh what was it, the perception of dialogue.
Michael Alcock:Yeah. Yeah, that's an important concept.
Tonya J. Long:Say more about that.
Michael Alcock:So I was studying this idea that we needed a human-to-AI feedback loop, so that algorithms could evolve without a bunch of PhDs in a room trying to predict all of human needs. Right.
Tonya J. Long:That's pretty funny, I just have to acknowledge. That's a cartoon for me. Keep going.
Michael Alcock:So I was thinking about that, and I realized something from my days at Microsoft. You mentioned that I basically had a larger level of influence than my title would suggest in that job, because I was deciding what would be talked about by very important people. Right. And so, because of that craft, I realized that most of my success came from the fact that when you have a major event with people of that caliber and that power level coming into a room for three days or two and a half days, they are bound by most of the feedback they get in life to not say anything meaningful or dangerous in that room. And so if you want them to engage in actual dialogue, you have to create the perception that this is a turning point, that this moment is different from the normal moment. And from that was born this idea that any system that tries to do AI-to-human, back-and-forth feedback loops has to have, integrated into its very context, the context for the existence of the system, the perception of dialogue. Because people will not share their real perspective unless they feel like they've received the natural human triggers for, this is a moment of change.
Tonya J. Long:They're part of the change. The word dialogue, to me, means collaborative. Dialogue is not a one-way street of, I tell you, you go do. Dialogue to me is inherently a team sport. And so I see that being an environment we have to create for people to feel like they are contributing, not just being talked at.
Michael Alcock:Yeah. Now, 99% of all media is talking at you. When I did research into this concept of productive dialogue, I had some real eye-opening moments. I'll tell you a story. Okay, I work at Microsoft. This is before I had my CIO job. I was the head of content and community for engineering excellence, which all that means is I was responsible for throwing the event once a year where the 40,000 software engineers at Microsoft, because regular engineers don't like when software engineers call themselves engineers, but anyway, where the software engineers of Microsoft were supposed to gather and have an internally public conversation about the future of software engineering. That was the purpose of the event. It's called the engineering forum, right? And I was kind of curious about why we were doing this event and whether or not it was delivering real value. And so I did a research study. I looked at all of the internally public digital dialogue in the company. So back then it was distribution lists. Like, you're a part of this DL or that DL, for engineers, by engineers, right? And with 40,000 engineers, there were a lot of them, right? And I got permission, one by one, to accrue all that data and get the whole story. And then I gathered three years' worth of that data and I analyzed it for productive dialogue. And I used seven different definitions of what productive dialogue might mean, from as simple as four responses with some kind of language that indicated that somebody in those four responses might change their mind about something. That was it. I set the bar pretty low. There was zero productive dialogue in three years' worth of internally public digital discourse. And that's because nobody was going to risk their job over it. If there was productive dialogue to be had, it would be had with your product group, with your engineering team, with the people who are close to you, and you would hope that would filter up through the engineering organization. There was no internally public dialogue.
Tonya J. Long:I think that's a product of trust. You stick with the people who understand your language, who anticipate that your intentions are the best.
Michael Alcock:Well, and also some of Microsoft's greatest strengths were, you know, an antithesis to it, right? Microsoft is an empowered culture, which is a nice way of saying we all have knives and we're trying to stab each other, right? Because if my feature outperforms your feature, I get more resources, I grow, and you don't grow, and then you have to find a new group to belong to. And that has served Microsoft very well with the original problems of software development, which was, can you ship? That was the problem. If you can't ship, what do you do? Right? And I've had long, brilliant conversations with the smartest minds at Microsoft about the very careful design of the product team and the feature team and who should be on it. And I gotta say, every time I learned something new, I was like, that's brilliant, you know. But it was also standing in the way, right? So the real question I had to ask was, well, what's the value of productive dialogue in that context? And so that kind of informed my job, which is probably why I ended up over in the CIO job.
Tonya J. Long:So yeah, I can imagine. We're gonna do a quick station break to recognize Pirate Cat Radio. We are 92.9 KPCRLP in Los Gatos, as well as a couple of sister stations. We have sister station KMRTLP 101.9 out of what has to be sunny Santa Cruz. And our newest station is 101.9. That's KVBLP in Portland, Oregon. How about that? Probably getting closer to my home time. A little closer to you than to us, but we've got a sister station now in Portland, which is wonderful. So, welcome to Pirate Cat Radio and RESET with Tonya with MG Alcock. So, MG, I want to know the origin. Yeah, let's do it now. What's the origin of the MG?
Michael Alcock:I've been playing for years with the idea of what goes better with Alcock. And so, MG. You know, and also I'd end up in all these startups. The last 15 years, I have done many experiments in human-to-AI feedback loops. And I've learned a lot. None of them took off, but, you know, that's part of the learning process. And I kept running into partners who were also named Michael. And when you have a small startup and there's only four of you, you have to argue about who gets the original name.
Tonya J. Long:Gotcha. The only Michael at startup.com. Yeah, right.
Michael Alcock:Right. So I was like, you know what, uh, fine, I'll just be MG. Yeah.
Tonya J. Long:Okay. That seems like a pragmatic reason for doing it. I was hoping it was something crazy, like, you know, you had a collection of MGs when you were in your 20s. I had a friend in Tennessee who had a little MG and we had a lot of fun in that car.
Michael Alcock:So, well, it also happens that one of the reasons why I had to slow down in life is because I have an autoimmune condition. And its initials happen to be OMG. I know, right? It's really weird. And so they don't call it OMG. Of course they don't. Of course they don't, because nobody would know what they were talking about, so they just call it MG. So it was almost like I was telling myself to acknowledge that this is part of me.
Tonya J. Long:I accept it, I embrace it, yeah, exactly, and it has other functions. Very cool. So let's talk about the writer stuff.
Michael Alcock:Yeah.
Tonya J. Long:You're writing a novel. I've written a nonfiction business book on AI. I tried to put a little story into it because I'm blingy that way. But you're writing a novel in order to communicate what is inherently a business story, but it's a society story as well, because AI is not just about business. AI, and the impact of all the quantum work that's being done, will be far more than just business. So you're writing a novel, and you say that it's a channel for helping people see and understand what's next. And it's a little sci-fi, I believe. I don't know why I'm doing my finger wave, but it's a little sci-fi. Tell us about this world that you're building.
Michael Alcock:Well, yeah, yeah. So here I am trying to make the world a better place, imagining human-AI dialogue at scale and all the advantages it could deliver to society. But if you're gonna write a novel, you need conflict, you need tension. And it's so much easier if you're gonna write a sci-fi novel to make it post-apocalyptic and, you know, dark and dreary. I even got to ask a famous sci-fi writer, William Gibson, that question once in a session. I said, why do you make everything post-apocalyptic? And he's like, well, it's a lot easier to design a plot around that, right? If you're gonna introduce the concept of a new technology, get rid of everything else and just make everything a crisis. It makes the story go real easy. And so I had to look at this idea I was so passionate about and think about the new problems we'll have. The problems that come after the, why don't I have a job anymore? Right? And I found those problems were fascinating. They were terrifying and fascinating at the same time. So even though the world was better, it also had these major problems. Like, if an AI algorithm needs feedback loops in order to grow and evolve and better serve humanity, what about the unpleasant parts of feedback loops? What about the feedback from human beings who are not having a great time? Like a human being who's in a field of battle, in a war. I was like, oh yeah, that's right. The perspective of war doesn't just disappear, because if it disappears, then you don't have those perspectives. And so I built a novel around that. I called it Ogun Shogun, which is a combination of two words. One is a South African, no, West African deity who loosely translates into a deity of technology. And I'm sure I'll come under some criticism for that, and I'll accept that criticism. And then the word shogun, which is the great leader, right? The war leader. And so I imagined a far future where there's one planet in the entire universe that's completely designed and dedicated to the perspective of war. In other words, it's designed to perpetuate battle. And I said, well, what's the most humane version of that that I could possibly imagine? And that's what the novel became.
Tonya J. Long:It's a post-apocalyptic world that you've created, and you use the term perspective economics.
Michael Alcock:Yeah.
Tonya J. Long:So why does that construct matter in a future where work and society don't look anything like they look now?
Michael Alcock:Yeah, well, I mean, it all comes down to a simple scene, right? I've got a character named Abe, and he overwrote all his memories. He deleted all his existing memories and overwrote them with fake memories that he invented from our time, right? So, America right now, before generalized AI or anything else. So he's a stranger in a strange land, and he suddenly has a new best friend, Chuck. Chuck is amazing because he illustrates perfectly what I mean by perspective economics. Because Chuck finds Abe, and Abe is in trouble. He's got to go to prison, he's got people after him, there's all sorts of things going on. We're on the planet of war. Chuck is just a construction worker, and he's just along for the ride. And at one point, in a big fantastic trial situation, the judge says, hey, Chuck, you haven't done anything wrong. You can leave. Like, this guy's about to go to prison, you don't have to go. And Chuck goes, no, no, I'm gonna stick around a little longer with this guy for perspective purposes, and everybody just takes that as normal. They go, oh, of course you would. That's something that just wouldn't happen in our society today. But this is a construction worker whose value to society was showing up at a construction site and having a perspective on the construction, and he's just decided to shake things up by following around this wanted criminal. That is perspective economics. Nobody blinks an eye at that in this made-up future because they understand that Chuck's perspective is an asset, and it's inherently more valuable if he follows his nose and looks at the things he's interested in.
Tonya J. Long:You're gonna meet somebody on Thursday named Jesse Anglin. Jesse lives in Coeur d'Alene, Idaho. I met him through this framework. He's a digital labor expert. He's been building AI agents for 10 years, maybe.
Michael Alcock:Oh wow.
Tonya J. Long:But he grew a company, mostly in India, to 300 people, and then as AI became more mainstream, he reworked his processes and took it from 300 down to 100. And he made a point, when we talked on this podcast, that he could have taken it far, far lower than 100. He didn't call it perspective economics, but I think you'll enjoy it. He said, what fun would it be to run a company with no people in it? So he could have automated down to just the knit of what required a human hand, but for him, he felt like the number was a hundred to have anything that was worth doing, you know, in terms of having people involved. And I consider that perspective economics.
Michael Alcock:Oh, it absolutely is, and I have one better. I met three guys once who worked for a company up in Canada, and all this company did was have the right kind of paint to paint the little transformers that appear every so often along all the power lines in every country. Okay, oh wow. Yeah, so the company had automated this to the point where they didn't need anybody in the building. The truck backs up, and... they fired everybody. And then they realized that when something went wrong, nobody had a perspective. And so they had to hire five people back. And I was like, well, what do you do every day? They're like, well, we make sure that we understand that it's doing what it's supposed to do, so we comprehend it. And I was like, that's it, I need to know more about you. Like, you know, I think these guys felt a little uncomfortable because suddenly I was like, no, no, I have to know.
Tonya J. Long:They hadn't had anyone interested.
Michael Alcock:Yes, because to them it was like, oh, this is crazy, and I was like, no, that is the future. You just proved my point. You know, I love it. That's what'll happen. And what I'd like to see is our nation, or our world, not have to reach the point where everything collapsed and then realize they need perspective, right? I'd like to learn from those examples.
Tonya J. Long:Like it or not, like it or not. And I'm a little biased here. My master's degree is in public policy, and I think the world runs on policy, runs on, you know, the influence that policy has. But I believe also that storytelling, whether it's journalistic storytelling or more, you know, long-form prose like the novel that you're writing, is an opportunity to influence, advance, even educate policy and governance into being able to easily synthesize and see the people that policy is intended to serve. So as you journey through your storytelling, and arguably a novel is one of the highest forms of storytelling and commitment to a story, what are you finding about storytelling and fiction, and how do you believe, as your story evolves, it might have the potential to impact the constructs we have around governance and policy?
Michael Alcock:Well, I've been fortunate enough... well, I said that I've been working on this for 15 years, and the first seven years of that was 7,000 conversations. I'm doing that math. That's, yeah, okay. That's a lot of conversations. It is, and I just kept trying to have the conversation about this thing that I was inexplicably passionate about with as many people as I could. And I was fortunate enough to encounter some really significant policy people along the way. Now, admittedly, most of the time, my storytelling hadn't been refined enough to make any sense at all. But as I got better at expressing this fundamental concept, that what we say matters, and what we say will put food on the table and a roof over our heads and allow us to raise the next generation, when I started to be able to tell that story more eloquently, I was fortunate enough to have some really high-level political people to talk to. And there's a simple reality in policy. Policy doesn't change until it can perceive the need for the change, which is why I'm writing a novel. We have to popularize a concept so that it can reach the people who can make that concept a reality. We didn't get the cell phone overnight, but Star Trek was using it well before. That's right. Right? We have to imagine perspective economics, or we'll never have it. You know, actually, that's not true. It's inevitable, but there'll be a lot more pain on the way. And so if I can help my friends who write policy, it would be to give them a platform to tell the story with, to popularize the concept, to get voters to go, oh, I get it. Okay, this is that thing. And if I'm not the one to do it, how can I help the person who is, right? We have to get this story out there.
Tonya J. Long:And you made a good point when you mentioned Star Trek. And I don't care if it's Star Trek or Terminator's Skynet, both served very different purposes in educating people. Clearly I'm a huge AI advocate, but on the Terminator side, I get the Skynet question all the time. And I think it's good that we understand that with great power comes great responsibility, and we will be developing the power to build, and to have technology building itself, in ways that we need to be very conscientious about. And that's what Skynet taught us, or at least taught me. So these are great examples of how storytelling influences our ability to tolerate the future.
Michael Alcock:Yeah, I mean, in the novel, I had to introduce my surrogate for policy. And my surrogate for policy is an AI named Big Al. Because when you type Big Al, it looks like AI. And so everything that's Big Al in my novel is really the things that could only happen if policy came to exist. And I even describe the policy. It's called the Compact of Perspectives, version three. And I have little parts where I make jokes about the laws and the subsections, but I can't spell it all out, because that would be boring, and I need lasers and sexy women and handsome men and, you know, lots of things to make it interesting. Otherwise it's not gonna sell, right? Right. But if I can create a story that sells, that has a heart and is joyous and amazing, but also has the Compact of Perspectives version three in it, with Big Al as the stand-in for, I mean, you know. But here, look at it this way: The Wizard of Oz. Do you remember why that was written?
Tonya J. Long:There were, well, tell us, because there were competing interpretations of what it was really trying to influence.
Michael Alcock:You're right, you're right. And I could get in a lot of trouble saying the one version and you're not acknowledging the others.
Tonya J. Long:So, forgiveness, but I think it's fair to say most people think it was about the economics of the time. It wasn't just about a little girl who dreams her way through a different world, i.e., the Emerald City. You may not remember, my first book was about the Wizard of Oz. So anyway, yes, I get all worked up. Yeah, yeah. You're like, I know a thing or two about that. But yeah, it was a story arguably crafted to send a message in a way that people could tolerate it. That's right.
Michael Alcock:And whether or not it worked, and all sorts of things, you know, the attempt was made, right? And we have a lasting visualization in our minds: the ruby slippers, follow the yellow brick road. These are all concepts that are just ingrained in our culture. And we need to ingrain concepts in our culture. Whether we call it perspective economics or not, we gotta find the one that sticks, right? And the first step to doing that is to put a stake in the ground. And Ogun Shogun is that stake. And if you don't like it, come talk to me. Yeah, let's figure out how to put in a better stake, right?
Tonya J. Long:You know, and I think, and I've had to redo things before, you don't tell a story just so everybody goes, wow, that was a great story. I think you tell a story to create a platform for better dialogue, right? I mean, that's the goodness that comes out of it. It creates a vessel for the conversations that need to be had.
Michael Alcock:Well, and it also addresses some of the showstoppers. Like, somebody says, oh, free markets this. Well, in my novel, free markets are absolutely operating. It's called perspective economics for a reason. It's a free market for perspectives. There are so many showstoppers, there are so many easy rejections of the reality to come. Yeah, and you kind of have to tackle them and make sure people feel comfortable. At least that's the idea.
Tonya J. Long:Yeah, I got you, I got you. Speaking of tackling things, so much is gonna change, and we are already overwhelmed by the volume and the velocity of change. But I think we're all gonna need to learn to learn differently. And I could go into a whole other hour on learning with curiosity and all that good stuff, but just the fact that we're going to have to become more of a learning culture, which inherently opens up acceptance of change in order to want to learn. What do you think it looks like for organizations, and even maybe societies, to truly transition into global learners?
Michael Alcock:First of all, I'd like to acknowledge that whatever version of a society that is always open to change is, that version must feel safe. Okay. That's what our visions of our communities do for us. We believe in something because it makes us feel like we're on the right path. And so if constant change is the right path, we have to believe that the system that governs that constant change is safe. And so I spend a lot of energy, in these experiments that I do and even in the novel that I write, on how do I convey that implicit safety. Right? And that's number one. So I've seen many organizations try to become change organizations or learning organizations. And the truth is that if you don't tackle the issue of the things that make you feel uncomfortable, you're going to revert back to being a static organization. And that's okay. That's, you know, where we started, but we have to find the formula at the next level, and I think it has a lot to do with studying the nature of dialogue, because all change has to be accepted, and the only people who can accept it are the people themselves. You can't change a person with training, you can't change them with the metrics that you drive. You can only change them when they decide to change themselves. And so, what is the best mechanism for a person trying to change themselves, or deciding to change themselves? It's dialogue. It's a mirror that reflects them to themselves in context, in the context of the rest of the organization. And I've tried many times to build that, and I will never stop trying to build that, but that is what you need. And it also invents a new type of professional. Because in organizations that embrace that, the leaders need a team of people who are experts at recontextualizing a bad message. And I don't mean spin, and I don't mean marketing. I mean taking a truth that is encapsulated in a nasty, scary, legally scary, you know, construct.
Tonya J. Long:And helping people understand and tolerate. I've said tolerate a lot in this.
Michael Alcock:I know, I know, but leaders require that. Yes. You can't be a weak leader. You will not survive. You must be strong, you must protect the perception of your strength, and the perception of dialogue threatens that. And so you need a team of people who are going to protect you from that.
Tonya J. Long:Well, and just to challenge it, it's not challenging you, but you mentioned organizations that would remain static. And what just popped into my mind is, we won't have any of those, because they won't survive.
Michael Alcock:That's right. Yeah, they will go away.
Tonya J. Long:They will self-select out; they're the new dinosaur. But the problem is, and I say "the problem," but it's okay.
Michael Alcock:I just slipped into futurist guy, right? That's the sign that I've flipped the, uh, you know. And the problem I see is that I bore witness, up close and personal, to the mechanisms in our world that prevent us from doing this. Microsoft did some amazing things. The day I joined Microsoft was the day Windows Vista went to market. Vista.
Tonya J. Long:Oh my god, really?
Michael Alcock:Yeah, yeah. So I arrived with a crisis, like, you know, lots to chew on. Right, lots to chew on. And everybody was like, this is a big giant turd pile. We'll say, a dog. And people lost their jobs and they, you know... but the real underlying issues had to be addressed. Yep. Otherwise, Microsoft wasn't going to become the stock price that we enjoy today. Right. So the only way to do that was to make fundamental shifts in how everything, down to the tiniest unit, the team unit, dealt with failure, acknowledgements of learnings, like everything. And even then, there's a high risk of slipping back into a static organization.
Tonya J. Long:Yeah.
Michael Alcock:It's really hard to do.
Tonya J. Long:I see. All right, so we've got just a couple more minutes to do my favorite part, which is the lightning round. Quick answers, fast answers. And honestly, you know, I'm blessed with really great dialogues in this podcast. Like, this one has been a great dialogue. But the lightning rounds are where I learn the most, because people are unfiltered, right? They're not trying to, you know, present for an audience, they're just being themselves. So, with that in mind, what book has shaped your life the most?
Michael Alcock:The Hitchhiker's Guide to the Galaxy.
Tonya J. Long:That's a good choice. That's a great choice.
Michael Alcock:I mean, nobody, nobody has done a better job than Douglas Adams at taking profound insights and instead of making them exposition, making them profoundly funny.
Tonya J. Long:Yeah, yeah, yeah. It's an art. It is. So what's a place in the world that changed you?
Michael Alcock:A place in the world that changed me? Los Angeles. Well, because I'm from New York. I'm a New Yorker. And when I went to Los Angeles, I hated it for 10 years. I did meet my wife there, and eventually learned how to love it. But Los Angeles changed me in ways that I can't even really begin to communicate.
Tonya J. Long:Okay. Good answer. What's a lesson that you learned in your time at Microsoft that you still knowingly apply today, and recognize as something you learned during your time at Microsoft?
Michael Alcock:Well, I mean, the thing that I learned the most was, be liked. I mean, yes, Machiavelli had a point, it is better to be feared, but that's not how the modern world works. The modern... sorry, I'm gonna fix my microphone real quick. The modern world, we have to get along. And productive dialogue can't happen if we're at each other's throats. So find a way. Find a way to do that.
Tonya J. Long:I think those are important pieces of wisdom for you to share.
Michael Alcock:You know what? I shouldn't have said be liked, I should have said be likable.
Tonya J. Long:Yeah, yeah.
Michael Alcock:Yeah.
Tonya J. Long:Yeah. Yeah. No, but I like it. I think that's a great response. So back to your book. Back to your book. What's an intriguing or surprising thing in your book that you were inspired by, and that would surprise people when they read it?
Michael Alcock:I think the most inspiring thing is how you really have a hard time telling the difference between an AI and a human. And you think that's inspiring? I think it's very inspiring, because I think we're done with the age of the Terminator question. We made a lot of movies on that. Very good job. Yeah. Love them, you know. Watched them all. I'm a collector of them. But it's time to move on. It's time to move on to the next stage. The Matrix tried to do this with, you know, some other things, and I'll give credit where credit's due to a lot of anime from Japan. But we just have to be like, wait a minute, let's just assume that's gonna happen. Now what? What is life like?
Tonya J. Long:I love it. So my last question for you today, and this is less lightning round and more the right way to close. What role do you hope your writing and your platform, which we'll have to talk about in another episode, what role do you hope those creations play in helping leaders and society find agency in the transition that we're coming into?
Michael Alcock:Well, right now I find myself prioritizing two things: eyeballs and investment. Right? Eyeballs because I'm not alone. This problem is real. We're facing it. We have a nice little bubble, which will be followed by a nice little crisis, which will then be followed by the next phase. And there are so many of us, and we shouldn't be working alone. We should be connected. So that's the first thing that I hope to get from this. And the second thing is that the experiments that I do, the companies that I try to launch, like this next one, the platform that we'll talk about next time, it's important to figure out... there are real problems with dialogue at scale. It doesn't work unless we figure out these problems. How do you deal with lying? How do you deal with the absence of critical thought? How do you deal with, you know... I mean, the biggest problems in dialogue have a lot to do with our mass culture, and they're not going to go away. They are a permanent part of who we are; otherwise, we're not who we are. So, you know, how do you build a system that accommodates all of that without itself being the problem? I just can't wait. I mean, I get so excited about that. I have to bite my tongue, though, because we're not ready for that part. You know, we have to get the first part done.
Tonya J. Long:It's coming, and I think a lot of us see it. So it's good. So people are gonna listen to this and be like, I'm intrigued. How do I follow this guy? How do I know when the book's coming out? What's the best way for people to watch your space so that they can be informed and even participate?
Michael Alcock:Well, I'm a big believer in not having too many platforms, you know, social platforms.
Tonya J. Long:I've got to follow that lead.
Michael Alcock:Yes, because the more you commit to, the more work you make for yourself. Yes, it's true. So I don't launch a presence unless I'm gonna commit. I am committed to my presence on LinkedIn.
Tonya J. Long:Okay.
Michael Alcock:And I am committed to the eventual website, because I tend to not launch websites until the product exists. It's called MIGVOX, M-I-G-V-O-X dot com, which will be the next iteration of me trying to do collaboration at scale, I mean dialogue at scale, a human-to-AI feedback loop. Okay. And this time it'll be for writers, creative writers. Good.
Tonya J. Long:I will put those in the show notes so that people who didn't write it down quickly enough don't have to hit rewind. They can know and follow and see how this evolves. Because what I do know is you and I have been working at this, truthfully, for 25 or 30 years. It has evolved. And what we knew even five years ago is nothing like what we know now. So that means I have accepted that what I know now is only just a tinge of what I'm gonna know in five more years. And that's exciting. I love it, that's where we're headed.
Michael Alcock:Yeah, it makes it a lot easier for me to fail at all these attempts. I don't think you failed. Well, you know, I do like using the word failure, because I like fully embracing the negative aspects that people will project onto it, and being happy anyway, and having people see that. And I love that.
Tonya J. Long:And I think there's no better way to end this show than with the concept that we can embrace happiness, no matter how much is changing or shifting or out of our control, to be frank. But embracing happiness is the thing that's uniquely humanly controlled, and that's where we're at.
Michael Alcock:That's a good perspective to have.
Tonya J. Long:Thank you. It takes work, it does. So thank you so much for joining us today. I can't wait to see you on Thursday. And thank you for what you've brought to our audiences with this message. I think, you know, we'll have to have you back in six months, as things have evolved further with what you're doing, and hear more about what you're seeing differently by then. So until then, everyone, this has been RESET with Tonya with MG Alcock on KPCRLP 92.9 FM in Los Gatos and KMRTLP 101.9 FM in Santa Cruz, and last but certainly not least, KVBELP, 91.1, it took me a minute, 91.1 FM in Portland. Everyone have a wonderful day. The weather here's gorgeous. I know I always say that. So the weather here is beautiful, and we will see you same time next week. Take care. Thank you. Oh, and Michael, we do this. We make a little heart. Oh, so give me the heart. Awesome. That's the way we end our shows. So thank you. Thank you for participating. It was so much fun.
Michael Alcock:Thank you for having me. Thank you for having me. It's a pleasure to be here.
Tonya J. Long:It was a wonderful conversation. Everyone, take care, have a great week.
Podcasts we love
Check out these other fine podcasts recommended by us, not an algorithm.
Silicon Zombies
NEO Industrialist
TENSEUR Capital
Speakers Who Get Results
Elizabeth Bachman