In this episode of the Smarter by Design podcast, Susan Strom and I discuss The Modern Learning Organization Pipeline—a framework for helping AEC firms prioritize and maximize the return on their learning and development investments.
For decades, most AEC learning programs have relied on familiar formats: lunch-and-learns, live training sessions, and recorded presentations. But a new generation of tools—AI search, modern intranets, modular learning systems, and knowledge agents—is dramatically increasing the potential ROI of learning assets. When knowledge can be searched instantly, accessed on demand, and revisited whenever someone needs it, learning assets become far more valuable than they used to be.
That shift creates a new challenge: how do firms decide where to invest their time and energy? You can’t manage all the knowledge in your firm. So the real question becomes: which knowledge and learning investments produce the greatest return?
In this conversation, Susan and I walk through how to identify, prioritize, and design high-impact learning experiences in AEC firms using the Modern Learning Organization Pipeline.
Along the way, we explore:
Why AEC firms need to evolve into modern learning organizations
How firms can maximize the value of learning and development investments
The DESIRE framework, a practical tool for prioritizing learning opportunities
Why learning experiences should increasingly be treated like products
How firms are modernizing learning experiences for the AI era
How learning content can become searchable organizational knowledge
Why learner involvement and piloting are essential to good learning design
The rise of dedicated knowledge and learning roles inside AEC firms
Underlying the discussion is a broader idea: the industry is entering a platform shift in how knowledge, learning, and expertise are developed and distributed. As AI-powered knowledge and learning platforms like Synthesis make knowledge more accessible and reusable, the potential return on learning investments is rising dramatically. The challenge for firms is where to invest their time and energy to create the greatest impact.
If you’re leading an AEC firm and wondering where to invest in learning, knowledge, and capability building, this episode introduces a practical framework for prioritizing the opportunities that will matter most.
▶ Watch or Listen
Watch or listen to this episode via YouTube, Spotify, Apple Podcasts, or wherever you get your podcasts.
📺 🎧 YouTube
📺 🎧 Spotify
🎧 Apple Podcasts
📃 Episode Transcript
This transcript was lightly edited for clarity.
Susan: Hi, I am Susan Strom. I'm the Chief Client Officer at Knowledge Architecture.
Chris: And I'm Chris Parsons, founder and CEO at Knowledge Architecture.
Susan: Chris published something in our Smarter by Design newsletter, issue 15, about three weeks ago, called the Modern Learning Organization Pipeline. We're going to explore that today. So, Chris, it would be good to start with: what is a modern learning organization, and what is this pipeline thing?
Chris: Yeah, I think it's good to pull back a little bit. We have been observing that there are some challenges and opportunities that our clients in AEC are facing. On the challenges side, we see baby boomers retiring. We see really fast-paced change in terms of knowledge — the half-life of knowledge is shrinking. There's new technology, new standards, new materials, new codes. Your clients are expecting you to do more with less. And we've got this emerging generation — millennials and Gen Z — who are really ambitious and want to advance their careers.
There's also all this new technology. AI has hit in a major way, and there have been changes to video technology too. It's easier to capture, edit, and clip video and use that as a knowledge-sharing technology. This confluence of structural, cultural, and demographic changes, as well as technology changes, has created a perfect storm — in a good way — to really level up the way AEC firms do learning and development.
There's an opportunity to deliver the right knowledge to the right person in the flow of work much more than there ever has been. Part of what's exciting about that — partly due to all this change, and also the retirements and the new generation coming up — is this desire to upskill people more quickly, so they can take on more responsibility earlier in their careers than ever before. That's a career ambition for people coming up, but it's also a business reality for firms. There aren't enough of my generation — I'm a Gen Xer — there weren't as many of us born as there are boomers, millennials, and Gen Zs. A lot of us left the industry after the dot-com bust and then after the housing crisis.
So there's this gap of professionals with 15 to 25 years of experience that AEC firms need to figure out how to work around. The technologies have converged to make that possible. What we have built is a technology platform that combines intranet, learning management, AI search, knowledge agents, and really advanced use of video technology to help enable this change. That's a modern learning organization platform.
But in order to take advantage of that new technology, firms have to change some of their processes, culture, and the way they're operating. I've been calling that "modern learning organizations" for about a year. We have been in various betas with these new technologies — including our learning management system. Susan, you've been there with me since the beginning. At the end of last June, we were working with 56 firms adopting Synthesis LMS.
What we're seeing is that firms are working more on the people, process, and culture side than the technology piece, in order to deliver high-quality on-demand learning. In order to get the kind of coverage on topics that are important — so they can upskill people — they're needing to change their approach to learning and development and knowledge management. As a modern learning organization, firms will create and maintain collective intelligence, leverage technology to distribute the right knowledge to the right person at the right time, and become more adaptable and resilient as business and practice conditions continue to change. That's the high-level overview of what we're seeing.
Susan: Something that clicked for me here that hasn't clicked before: the transition to developing institutional knowledge and distributing it effectively using today's tools is similar to the transition from traditional marketing to marketing today. It's producing short videos, producing podcasts, taking knowledge from inside the firm and sharing it with clients in a way that feels accessible and interesting — meeting clients where they are versus where you want them to be, which is listening to your lecture. They're not going to do that, and we just accept that.
Marketing organizations have had to change the way they develop, distribute, and manage content. I feel like the modern learning organization is a similar version of that, but inside the firm. There are big-picture parallels, and also some tool parallels — you need to learn how to produce video of high enough quality, which I know we're going to get into when we talk about designing and building.
Chris: I hadn't connected those two things either. What I like about it is that firms that already have that marketing capability are turning around and using it to create compelling internal content. Just like outside the firm, you don't generally ask experts to work alone to create compelling marketing pieces — they partner with writers, video editors, and so on. We're seeing the same thing internally.
One of our clients is doing a series where they're going through their most senior subject matter experts, doing deep, high-quality interviews, breaking them into shorter videos, and creating those as lessons — but also seeing them as marketing material at the same time. LS3P's Expert Hours is another example — they spoke at KA Connect last year about interviewing their experts internally. Those live internal interviews throw off internal knowledge assets, but they also throw off potential marketing assets or ideas for external pieces. These things can be integrated, not just paralleled.
Susan: So I'm going to take us into the Modern Learning Organization Pipeline, which is on our website — there'll be a link in the show notes. It's somewhat linear: a few steps, then a jump, then another set of steps, then a jump.
I'm going to say where I think a lot of firms — and people — want to start, which is with design and build. It's like: great, I'm going to create a course. We know there's a new toolset, and I know onboarding should be better, so I'm just going to create an onboarding course and see what it's like. And before that, I might think: I need to set up a recording studio because I want this to be high quality, in short snippets, accessible by AI, and modern — I want it to meet my new employees where they are. So I'm going to get the right lights, write a script, practice a bunch, and then record. So it's not only design and build, but raising the bar while you're doing it. I've talked with some clients who've started there. I'm curious what your thought is on starting there versus starting more at the beginning of the pipeline.
Chris: Whenever you have a platform shift like this — and I do consider this a platform shift and a paradigm shift — there's always the temptation to jump to step ten versus starting at the beginning, because you see the promise.
In a traditional learning organization in AEC, the most atomic unit of learning and development is the hour-long lunch and learn — or maybe an hour-long training session, delivered live or over Teams or Zoom. Then it gets recorded and put on the intranet or a server. It's like: if you were there live, great, you get some credits; if you missed it, you can go watch the recording.
What we hear again and again is that those recordings generally don't get seen — or firms don't have the analytics to know if they get seen. Or maybe that hour-long recorded thing wasn't a great learning experience for someone coming to it after the fact.
Then we show firms the technology we've built around Synthesis. You can have courses with short lessons; those courses can be assigned or discovered; there are full analytics to understand if people consume them; there's feedback built directly in. And all that content is searchable in AI search — so if someone has a question about how to do something in Revit or about a sustainability guideline, it can just answer their question by citing that course, without them having to watch the whole hour-long thing.
It's a completely different paradigm. People want to run toward that — and they also realize that existing approaches aren't going to meet the moment. They need to level up the quality, and there's all this old content sitting there. It can become overwhelming to the point where what started off as really exciting becomes dispiriting.
What I tried to do with the pipeline is help people go through a structured process to identify the biggest opportunities, prioritize them, and right-size the learning experience. Take onboarding, which you mentioned — that's a high-impact moment. Depending on your firm, you're going to onboard 5, 10, 25, or 30 people a year, every year. Everyone who comes into the company goes through it. There's a huge ROI in nailing that.
Contrast that with a design presentation introducing a new technology — maybe it's mass timber, maybe it's a new auditing framework, maybe a new visualization tool. Some people will use it; sometimes it's almost an elective. You've got this spectrum from high-usage, high-volume, strategic initiatives all the way through things that are helpful sometimes. The pipeline helps you realize not all opportunities are equal. Some won't impact enough people to justify the investment. It helps you systematically think through what matters, to what degree, how to build something compelling, who the audience really is, involving learners to shape the content, and understanding if it's working. It's meant to help people not get overwhelmed and have a more methodical way to move forward.
Susan: One of those prioritization steps introduces a sub-framework — an acronym for testing an idea to see if it's worth prioritizing. The acronym is DESIRE. I'd also like to call out that you didn't use RICE, which is an acronym used in product management that sometimes comes into these conversations. I'm curious if you played with using RICE, and if you could explain what RICE is before we talk about DESIRE.
Chris: Let me take one more step back to contrast what traditional learning organizations are doing versus modern ones. A lot of the way people arrive at that hour-long session and decide what it'll cover is through some kind of professional development committee that looks for volunteers. "We have a slot coming up in April — does anyone have anything cool they want to share?" So it follows an ad hoc discovery of what should be taught, dependent on the enthusiasm, whims, and schedules of whoever's available to teach.
That has served firms — it's not bad — but when firms figure out where they're trying to go with all these emerging professionals who need to become job captains or project managers or senior engineers faster than the old apprenticeship process allowed, they have to become more intentional about which learning opportunities to invest in. So how do you evaluate these opportunities in a rigorous, fair, unbiased way?
That's where RICE comes in. It stands for Reach, Impact, Confidence, divided by Effort. That's the standard product management framework. I took the R, the I, and the E from RICE. Reach is basically: how many people will this affect? Onboarding touches everybody. Project management training doesn't touch everybody, but it touches a lot of people, and AEC firms are constantly creating new project managers. Impact helps you balance that: does it really move the needle? Will it teach a foundational skill they need to do their job better, or is it more of a nice-to-have — almost edutainment?
Susan: I toggle between reach and impact — and I think strategy ends up here too. New employee onboarding: if you're only onboarding five employees this year, that's low reach in the moment, but high reach over time, and huge strategic alignment.
Chris: Teaching project managers to be better at project management rather than throwing them to the wolves — big impact. So the formula is: Reach × Impact × Strategy × Enthusiasm × Demand, divided by Effort. Strategy is probably the most obvious — how well does this learning initiative align with where the firm is headed? If a big strategic initiative is quality improvement, is this aligned with that? If you're trying to move into more planning or consulting work on the front end of projects, does this build capacity for that?
RICE doesn't speak to strategy, which felt like a critical missing piece. It also doesn't speak to enthusiasm — who's going to teach this, and how motivated and excited are they? Are they going to be voluntold to teach it, or genuinely excited? And the last one is demand: do people actually want to learn this? Sometimes people have to learn things they don't want to — compliance courses, for example. But you can also hear from junior people or project managers that they really want to learn X, Y, Z skills. So demand is a factor.
That's a balanced scorecard of reach, impact, strategy, enthusiasm, and demand — then you divide by effort: how hard is this going to be? That's DESIRE.
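To make the scorecard concrete, here is a minimal sketch of how a firm might tally DESIRE scores for a handful of candidate initiatives. This is an illustration, not part of the published framework: the 1-to-5 rating scale, the example initiatives, and every score below are hypothetical assumptions.

```python
from dataclasses import dataclass

@dataclass
class Initiative:
    """A candidate learning investment, rated 1-5 on each DESIRE factor (hypothetical scale)."""
    name: str
    demand: int      # Do people actually want to learn this?
    enthusiasm: int  # How motivated and excited is the would-be teacher?
    strategy: int    # Alignment with where the firm is headed
    impact: int      # Does it really move the needle for learners?
    reach: int       # How many people will this affect?
    effort: int      # How hard will it be to build? (the divisor)

    def score(self) -> float:
        # Multiply the five value factors, then divide by effort —
        # the same shape as RICE's (Reach x Impact x Confidence) / Effort.
        return (self.demand * self.enthusiasm * self.strategy
                * self.impact * self.reach) / self.effort

# Example candidates with made-up ratings, for illustration only.
candidates = [
    Initiative("New employee onboarding", 4, 4, 5, 5, 5, 3),
    Initiative("Healthcare 101",          3, 3, 4, 4, 3, 5),
    Initiative("Mass timber intro",       4, 5, 3, 3, 2, 2),
]

# Rank highest-return initiatives first.
for c in sorted(candidates, key=Initiative.score, reverse=True):
    print(f"{c.name}: {c.score():.1f}")  # e.g. "New employee onboarding: 666.7"
```

One design note on the sketch: multiplying the factors (rather than adding them) means a very low score on any single dimension — say, no teacher enthusiasm at all — drags the whole initiative down, which matches how the conversation treats each factor as a potential dealbreaker. The raw numbers only matter relative to each other; the point, as Chris describes, is surfacing where two people's ratings diverge.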
Susan: I want to talk more about the demand and enthusiasm pieces. I love having those two next to each other. Thinking about the shift from "Susan wants to teach a course about mass timber" to "employees want to learn about mass timber" — people pulling us, saying, "I want to level up, I want to do more high-performance design and I can't if I don't know some basics." Without the demand side, we might build something that people want to teach but not what people want to learn.
Chris: AEC firms should be treating these learning experiences more like products. One of the foundational errors that founders make — myself included — is building something you think is a great idea but haven't validated a market for. That's fun, but it doesn't get you where you want to go in the long haul. A framework like DESIRE helps you ask: do we even need this topic at all? We have a recording from six years ago on a certain topic — maybe a little outdated, maybe we want to modernize it — but do we even need this topic? If you've got 10, 15, 20 ideas, this helps you get some rigor behind your decisions. It also surfaces disagreements.
Let's say your firm has a healthcare practice and you want to create a Healthcare 101 — a 30-minute to two-hour course giving new people the foundations of working in healthcare projects: the jargon, the medical equipment, the building layouts, the typology, the regulations. You and I might score this very differently. I might say the reach score is high; you might score it lower. I might think it will be transformational; you might think they have to work on the project to really learn those things. I might think it's easy to build; you might see it as a big lift because nothing is written down.
By using a framework like this, instead of just debating our tacit feelings, we can say: "Susan, you put effort at a five and I put it at a two — what information do you have that I don't?" It gets that invisible thinking out on the table.
Susan: Do firms need to do some kind of audit to move through this process? I've seen firms that take the time to write down a comprehensive list of their existing learning — even if just within one section, like all their Revit learning — and those firms seem to be moving faster. They slow down to build analytical, thoughtful plans and end up going faster as a result.
Chris: It's go slow to go fast — and not just go fast, but go with confidence. Being thoughtful is also good for team alignment. If I have a suspicion that the Healthcare 101 course is a waste of time but don't say anything, and you go build it, and it turns out to be a waste of time — that wasn't great for anyone. Or if I'm halfway through building and I realize we don't have any of this content written down and have to start from scratch — that's exactly the kind of thing a little upfront prevention avoids.
Susan: Another thing I've seen: when you have an analytical plan and you share it with someone and ask them to record a three-minute video, and they can see it fits into a well-thought-through bigger story that will have an impact — they're much more likely to prioritize it and do it well. They know where it's going and have confidence there's a plan.
Chris: Let me go one step further. Say we go through the exercise and find that you were right — effort is a five. But I was right that impact is also a five. That lets us say: okay, it's going to be a significant lift, but I see why it's worth it, and it sets realistic expectations. It also encourages good habits — the demand piece pushes us to go check with people who've recently worked on healthcare projects and validate: would this have been useful? Or, as we think about who's going to teach it, it helps us identify and bring them in early: "Do you have the energy for this? Do you want to drive it?" These things need to be thought through whether you do it upfront systematically or not. If you don't, they'll come back to you later — ideally not after you've invested 20, 50, or 100 hours into something nobody's using.
Susan: And your employee engagement score goes down. What did we do here?
On the effort side, one question that comes up a lot is: we're used to the one-hour format. Our internal educators are used to it. We want to shift to short videos, break things down, rerecord in a new way — five-minute videos or whatever. Then things start to ratchet up: it needs to be in a studio, it needs to be perfect, the lighting needs to be right. But that level of perfection isn't always necessary to meet the user's need. The audio quality needs to be good, but not every course needs everything to be perfect. Is there a floor for quality?
Chris: Audio quality really matters — there are $99 external mics that raise the floor very high very quickly, so that's an easy investment. But I think about quality more in terms of: is this thought through? Has someone just been going off the cuff, or has care and consideration gone into making something coherent? Is it concise — free of digressions that don't add anything? Is it clear that one thing builds on another? Is it something I can actually use, and in the next two weeks or six weeks?
That kind of quality — the care, respect, and thoughtfulness that went into making it — feels far more important to me than whether we're recording in 4K with perfect boom mics and polished editing. You could have something beautifully polished that isn't saying anything, or is beautifully wasting my time. I'd rather have something quick and simple but clearly thoughtful, designed to help me as a person. Do they feel it was designed for them, or does it feel like someone was just banging it out? To me, the floor is care.
Susan: And I want to draw a distinction between learning experiences you're going to assign and hold employees accountable for versus optional ones. If you're forcing people to walk a path, and it's not a path worth walking, they're not going to want to take another learning journey with you. And by "path worth walking," I don't mean the sparkle of the video — I mean it was thoughtful.
Chris: There's a tradeoff between efficiency for the expert making the content and the experience for the learner consuming it. Just popping on a Zoom and talking for an hour may feel like the comfortable, familiar thing — 45 minutes of slides, 15 minutes of Q&A — but that may not be the best learning experience. This is where the DESIRE score is useful again: if a course has a high score, like project management 101 or onboarding or Revit basics, the ROI justifies putting more energy into it.
The last piece of this is surrounding subject matter experts with support. You mentioned at the beginning that some firms have marketing capabilities for storytelling, writing, interviewing, and packaging video. To the extent you can take a subject matter expert and make it easy for them — they show up with their expertise, you've done the homework and have a theory of a script — anything you can do to make these very busy people enjoy the experience and make it an easier lift is worth doing. So effort for whom is an important question: effort for the expert, for the knowledge and learning team, for the marketing team.
Susan: Right. And a lot of what we're talking about assumes a short video format — five minutes, fifteen minutes. Someone drops in, learns something specific like risk assessment for contracts, can come back to it in three months when they actually have a contract to deal with. But I want to call out that there's also another kind of learning: a conversation between an expert and a non-expert, discussing a policy that's maybe somewhere on the intranet — less focused, but rounding out someone's understanding of why something is important. There's room for multiple modalities. Not everything has to be super linear for it to be a good experience.
Chris: The example that comes to mind is what Kristina Williams has been doing at Lionakis — people will hear more about it through podcasts, webinars, and KA Connect this year. She took a longstanding program — their design technology bootcamp, 52 to 54 hours a year of live synchronous instruction for new people who will be design technology specialists embedded in project teams. What she's been doing is exploring hybrid formats: the instructional parts, if done efficiently in prerecorded form, might go from 30-35 minutes down to 18-20 minutes. They watch those together as a group and then discuss.
In some of the beta cohort calls you and I were in yesterday, the approach was: watch this short video about how we're approaching AI tools, then book a 15-minute meeting with me to talk about what you learned and your ideas. These multimodal hybrid environments are what a modern learning experience looks like. It's not one-size-fits-all.
The side benefit is that people learn better when they can pause, rewind, or break it up. Maybe it's 18 minutes, but you want to pause, get some coffee, come back, watch the next 9 minutes, make notes, open up Revit, try the thing. Letting people go at their own pace is more accessible — and it also means that six weeks or six months later, when you don't remember how design options work, you can come back and rewatch. It wasn't contingent on being live with the instructor. And now that it's digitized, anyone at Lionakis trying to figure out design options can just search, and AI search will pull exactly the answers they need out of that course. You've done a lot by digitizing it — for the immediate learning experience, and downstream for others.
Susan: Something that came out of our work with Lionakis is thinking about retrieval while you're building the course. There was a specific example with Autodesk Construction Cloud and Forma: they were breaking out the courses and realized that "adding users to a project" was something people would need to come back to repeatedly. So instead of burying it inside a broader video, they pulled it out as a separate lesson and video — they optimized for the recall of a specific step they knew people would need to revisit.
Chris: It's like producing Saturday Night Live or The Daily Show — you're thinking in terms of clips and segments that will stand alone as YouTube videos downstream. These are key moments that stand alone but add up to a continuous experience. And that canonical hour-long session — maybe it was an hour because it wasn't as efficient as it could be, so maybe it comes down to 45 minutes. But 45 minutes can still be the right length as, say, five nine-minute segments. In addition to serving learners better, it also makes the content easier to maintain. If something changes that was in minute 37 of your hour-long recording, but you have it broken into a nine-minute segment, you just replace that segment. Less technical learning debt.
Susan: Yeah. And thinking about it that way — long-term strategic value, working backward, thinking about users' needs, thinking about retrieval at the same time you're thinking about the person taking it live or on demand, how AI factors into letting people extract the piece they need — you're thinking on multiple levels. Which gets back to the stress I mentioned at the beginning: this is a huge opportunity, and I don't want to screw it up.
Chris: Which is why it's worth more strategy upfront.
Susan: In the Lionakis story, they rethought the layout of the course because they understood how it would be consumed — just in time, via AI search, by someone looking for specific information. And I feel like understanding that your course might be consumed by someone who has a problem and is just trying to solve it might change the courses you develop and the people who are interested in developing them.
Chris: Beginning with the end in mind. The end in mind is this might get consumed by AI search. The end in mind is people who aren't the target audience but may find useful knowledge. The end in mind is the person who went through the course but doesn't remember it six months later. In the old approach, Lionakis had all those recordings from prior years, but they were hour-to-hour-and-a-half linear assets that were hard to break apart and hard to search.
For Kristina and her team, it's a bit more of a lift in 2026 to redo this content versus just showing up and teaching with the slides they already had. But they believe that in 2027, 2028, 2029, this will save a lot of time. And once this content hits Leo — the name of their Synthesis intranet — that knowledge is immediately searchable for everybody in the company. The return on that knowledge is much higher, which justifies the short-term investment.
Kristina has also been working on practice bootcamps with a subject matter expert who, once she told him his job was just to show up, talk about his expertise for 10-15 minutes, and then leave — was thrilled. "That's all I have to do?" They handle the editing and packaging, create a course from it, run a draft by him. In that case, you're increasing the return for the organization, lowering the investment for the subject matter expert, and maybe increasing the investment for the org slightly — but you still come out net positive. Jess Purcell and the team at Shepley Bulfinch actually calculated out the hours of doing it the old way versus the new way; she goes through that in detail in her KA Connect 2025 talk. Most firms haven't calculated it, but they can feel it.
Susan: This also plays into how we think about resourcing knowledge and learning in our organizations. A minority of our clients have a knowledge manager or an L&D department; the majority do not, but we are seeing those roles on the rise — a combined knowledge and learning manager role in particular. The ROI calculation for how much an organization can get out of training done really well and in a more modern way is changing. To get there, someone needs to do the work, which means giving someone dedicated time for it.
Chris: When we started Knowledge Architecture, my goal for the first KA Connect was to get together the named knowledge managers in the AEC industry for a roundtable in Chicago — maybe 15 to 20 people, because that's about how many there were. We ended up getting 80 people. Those knowledge managers were at SOM and Gensler and HKS and Arup — large, 1,000-plus person firms, generally.
Over the last 10 years, we started seeing more knowledge managers at under-500-person firms. In the last year, we're seeing them at under-100-person firms — not infrequently at firms with 50 to 99 people. I think we're going to start seeing them at under-50-person firms.
Firms are running that ROI calculation. Part of it is: we can take load off our experts — not just for knowledge capture and creation, but we can give our experts more leverage. If I'm thinking about Corey Squire at Bora as their director of sustainability — look at what he's built over the last few years in terms of their knowledge base. He was working with a team to build that out, and a good part of his job was to do it. Now they're in the learning management phase. The whole idea is that the expert gets more leverage: they can scale their expertise and spend more time researching, innovating, working with clients, mentoring — versus answering the same 101- and 201-level questions over and over.
And the experience for the employee? I don't have to wait until Corey's not busy. Corey might be on a job site or on vacation. I can keep going in the flow of work without disrupting the expert. There's a flow state that gets unlocked.
Whatever that knowledge and learning management team looks like — whether it's FTEs, whether it's like Kristina at Lionakis, who's 50% design technology and 50% knowledge management; or Katie Robinson at LS3P, who's Chief Marketing and Knowledge Officer — somebody needs dedicated time in the week where that's the thing they're thinking about, not the ninth thing on their to-do list they hope to get through by the end of the week. You're not going to become a modern learning organization doing it ad hoc and under-resourced. You're not going to be able to take advantage of these new opportunities and this new technology. This is a platform shift, and it requires rethinking people, process, and culture as well as technology.
Susan: I think it's fair to say more architecture firms than engineering firms have historically had dedicated knowledge and learning resources. But I feel like that's changing. We just had an 80-person engineering firm say they're hiring a knowledge manager, and I know of another similar-sized firm doing the same thing.
Chris: What are you seeing there?
Susan: Engineers appreciate the math. Smaller engineering firms can be more nimble in adjusting their teams and structures. They're thinking: all our engineers are billable and on projects, our marketing team is running lean, HR is running lean — so where is this going to happen? Someone new. And in some cases, that might look like a creative reassignment. One of our clients had a couple of planners whose project stalled — funding, something like that. So the thinking was: we have about a month of runway where these people are underutilized. Let's make that time intentional and give them something meaty to work on.
Chris: If we're talking about the same organization, the COO is running point on this operation. She's the one who made that call because she's deep in building the modern learning organization for them and connected the dots. But if no one is at the leadership table every day thinking about how to resource this aspect of growth, it's going to be hard to identify the opportunity.
Susan: This is all medium-to-long-game stuff — working on the important but not urgent, working on the business rather than in the business, sharpening the saw. There are like nine ways of saying the same concept — I could keep going. There are so many cliches around this, but yes, it's that.
I was on the phone with a client this week who isn't on the modern learning organization journey yet but wants to get there. And I'm going to pull back to AI for a moment. I've seen firms go too far in one of two directions: AI is fantastic and is going to do everything for us, or they're afraid it won't work. One thing I haven't seen work is just pointing AI at 30 years of experience on a server and expecting it to answer any question. Part of why I don't think that's possible is that the data is super messy — there are all the "final_final" and "no_really_this_time_final" files. A person who hadn't worked on that project wouldn't know which was the right answer, let alone AI.
The firms making the most progress don't believe in that magical thinking. Yes, AI will help do some things, but you still need to build the digital knowledge and learning foundation that will allow your people and AI to understand how you work, what you've done, what your approach is.
Susan: I think the best firms are treating AI like a new employee. You don't hire someone unless you think they're going to give you ROI. And then you onboard and train them with the same intention you'd use to make any new employee fully productive.
Chris: Capturing knowledge is how you train an AI like Synthesis — writing it down. You don't ask a new employee to read your 10 to 20 terabytes of project files and then do their job. You figure out what job you hired them for, give them the section of the knowledge base that pertains to doing that job effectively, and measure them on that. AI intelligence is a bit broader — it can do multiple jobs, not just one — but the same principles apply.
Another way of saying this: the modern learning organization is thinking about the learning and development of both its employees and its AI — the agentic layer that's going to be part of your business. I can't say this with 100% confidence, but today is probably the worst AI and the least AI we will ever have in our businesses. If you believe that we'll have more AI and it will be better, then it makes sense to invest so that the AI understands your projects, your people, your standards, your processes, your culture, and your values, just like you'd want new employees to. You have to capture that knowledge and put it in a format that is accessible to both humans and agents.
Susan: And bringing this back to the Modern Learning Organization Pipeline: developing good courses is just one part of it. It does help train the AI. But it's this whole flywheel: AI's ability to surface content when the learner needs it is driving the creation of new content, driving the resourcing of that content, and driving firms' willingness to invest in developing it. Because you're not just giving it to the five new employees who joined this year — you're giving it to all your employees and to the AI.
Chris: Exactly. And once people understand that this is the end game, it changes the ROI calculation. The piece that's not here today but isn't hard to imagine is in-platform AI tutors. It's not just "Susan wants to learn to be a project manager today" — it's that you go through Project Management 101 supported by an agent and AI search that can answer your questions. And further down the road, an AI that understands Susan specifically and can mix and match resources and learning experiences to help her on her particular learning journey. That future isn't too distant, and the step to put yourself in position for it is knowledge and learning management — which brings us back to: do we need better-resourced teams to get there? Yeah. It's exciting.
Susan: I love it. Back to strategy — we're going to get some spreadsheets out of this podcast.
I want to talk about end users and when we involve them in the design process of a learning experience. In the pipeline, learners are named as people you should consult around right-sizing and design-build, but they can certainly show up in earlier steps too. There's a risk I've seen play out where bringing people in too early — before having something material for them to see or understand why you're talking to them — can derail things. There's information asymmetry between what the learner knows about the big-picture learning opportunity and what the champion or learning team knows about what they want to accomplish. So when is the right time to bring learners in to incorporate their feedback without overwhelming them?
Chris: I do have them in step one — there's an "informal conversations with learners" piece in there, around building the list of opportunities to consider. That's meant to counterbalance the tendency we all have — having been emerging professionals ourselves — to think we know what would be useful based on how we remember it. From our more senior position, we think: they need to know this. They may not know they need to know it, but I need them to know it. Those things are both true, but things have changed since we were emerging professionals. They may know of new technology or practices from other firms that would be great to incorporate. Engaging learners as part of your identification process is really valuable.
There is information asymmetry, but it can cut both ways — you don't always know what your audience needs, and the more time you spend talking to them, the better.
There's another area where it's really important: I've been hearing the argument that attention spans are super short and everything needs to be 90 seconds or two minutes because of TikTok. But there's counter-evidence — people watch the Acquired Podcast, Joe Rogan, binge entire TV series. If something is interesting enough, people will keep going. I'm skeptical of the claim that people don't have the attention span. I do think people have lost some endurance — Cal Newport's newsletter referenced a piece about NYU Film School students having trouble sitting through full-length films — but I think if your content is valuable enough, people will finish it. Which is another argument for on-demand content and letting people go at their own pace. Talking to end users to understand how they actually consume content is useful data.
In the issue I published today, I linked to a clip from Todd Henderson at Boulder Associates. What they did was have learners become the teachers: they had a healthcare expert named Kate who had built medical planning tools, and the strategy was to have emerging professionals interview Kate and become expert enough on the tool to present it to the office and document how it works. That offloads burden from the expert — both from teaching and from being the perpetual go-to person for questions. There's a wildly interesting number of ways to involve learners.
Susan: In the pipeline there's a pilot step — and if that's the first time you involve learners, you're probably going to have to rebuild a lot, because you've spent significant time and money building the thing and you're just crossing your fingers.
When we're launching intranets, we do an early adopter phase about three weeks before launch: bring in some people who've never touched the intranet — not content editors, not on the leadership team — and have them give feedback. Information always comes out that helps the team make small adjustments before go-live.
For learning experiences, I think it's a different model. You still need representative learners — not someone else on the intranet core team or the learning team — actual learners. And I think you're going to get information back that requires more than a couple weeks to address before you're ready for prime time.
The other thing I'm noticing is that most clients are launching one or two courses, or maybe one learning path, at a time. That gives you the opportunity to pilot, improve, launch, learn from that, and then go back to prioritization for the next program — carrying forward everything you learned from the first one. So organizations will be running content through the pipeline roughly one content area at a time.
You might be actively distributing your Bluebeam training because it had a good DESIRE score and someone was passionate about it. You're piloting some Revit training. And you're in prioritization and right-sizing for PM training, trying to figure out where to even start with that one.
Chris: And do I do PM training as a monolithic block, or do I start with something specific — like negotiation, or difficult conversations, or fees? Maybe I break it down and go after the most important part of project management that we're struggling with, that people want help on.
Susan: The "everything bagel" approach to the pipeline is actually really good: get things out to users and into pilot as soon as you can so you start getting feedback that feeds into what you're developing next.
Chris: Right. When we go into beta for a product, I'm not trying to figure out in the beta whether doing knowledge agents was a good idea. If I don't know that by the time we get to beta, I haven't done my job as a product manager — I should already understand the core problems to solve, the jobs to be done, the use cases, with a high degree of certainty that they matter and will make an impact.
What I'm hoping happens in beta is surfacing the things you can't see because you have the curse of knowledge. We're seeing clients pilot courses and discover: I didn't understand what this quiz question was asking, so I couldn't make sense of the answers. Or: I don't understand how we went from topic three to topic four — something feels missing. That's the level of cleanup that comes from piloting. Someone's using it in an unexpected way, or something you thought was super clear turns out not to be.
Susan: One I heard this past week: the modality of "watch the video now, then schedule time with the internal expert" — when a pilot user got to that step, they asked: "Do you really want me to schedule time with you?" It was a different way of working that needed more context: yes, this is expected, this is welcomed, here's an agenda, here's a booking link. You're just not going to be able to see that yourself.
Chris: Exactly. I said "book a meeting with me" — how is that not clear? But it wasn't. That's exactly the kind of thing a good pilot catches.
Susan: Every AEC organization is doing some learning, whether it's named or not — otherwise the organization probably doesn't exist. Some of those processes have a lot of momentum. New employee onboarding, project management onboarding — maybe not defined, but people are becoming project managers somehow. Within the pipeline, those existing experiences sit at the end: they're rolled out, people are engaging with them, they're happening. One way to approach them is with the measuring activity, experience, and outcomes step at the end of the pipeline — then using that information to go back to the beginning and rethink your inventory and prioritization.
But there's a lot of baggage and politics tied up in opening that box. Getting all the powerful people involved in, say, PM training to face the fact that it might not be working or that it's being done inconsistently — that's hard. And these things are hard to measure because they're often not formalized. There's no feedback form. So how do you go about capturing information on whether they're working, so you can prioritize accordingly?
Chris: Picking your battles is important, and DESIRE helps with that. Let's say we have a project management course: three straight days, in-person, everyone's burnt out by the end, retention is low, and they don't immediately apply much of it because they're not at the right project stage. You run it through DESIRE and it gets a middling score — not great, but fine. So is this the hill to die on this month or this quarter? Or is there a Healthcare 101 scenario where you have nothing and the gap between where you are and where you could be is so much bigger? That one might make more sense as the next investment.
And when you do the DESIRE score, it might surface that enthusiasm for changing the PM course is very low — or conversely, that everyone agrees it's awful and wants help. DESIRE gives you a depersonalized, depoliticized way to have that conversation.
Once you've identified and aligned around "we need to revamp the PM training," then you go find the data you can: interview the last 5, 10, 15 project managers who went through it. What stuck? What have you never used? If you were going back into that role, or preparing to step into it, what are the gaps you'd want to close, what feels most daunting? You get into design from there. You don't always have perfect data, so you find what you can.
Susan: Yeah. That seems like a nice place to end — continuing to loop back to the user. What do they need to learn? What are they seeking? Let that inform the whole process.
Chris: You and I are aligned that doing some strategy and prioritization work upfront is a really good idea — but it can go too far. You could spend months chasing your tail figuring out the perfect sequence and strategy, meanwhile there's a knowledge gap that could be filled right now. Maybe it's onboarding, maybe it's a new technology initiative, maybe it's AI or safety. Sometimes you can skip the line a bit: we're all agreed this has a high enough score, let's go build something, right-size it, run it through a pilot, without waiting until we have a five-year plan figured out.
As much as the pipeline looks linear — and all models are wrong, but they're useful — this is a looping process. It's almost like a fractal: the organization is in a pipeline, and every course goes through its own phases. Getting started, thoughtfully, is the balance: don't rush in, but don't take too long on analysis either. Hopefully that's a balanced way to move forward.
Susan: This was great. Thank you so much for walking through it. It's a fun way to explore the framework.
Chris: As people implement and use this and have questions or refinements — this is the first version, and we're going to continuously be learning and unlearning around modern learning organizations and the pipeline — I'd encourage them to let Susan or me or the team at Knowledge Architecture know. We'd love to hear it.
Susan: Awesome. Thank you, Chris.
Chris: Thank you.
