Paradigm Makers Moonlit Minds Journal: Edition 10
The history of Corporate Buzzwords and creating a shared language
Welcome to the 10th edition of Paradigm Makers Moonlit Minds Journal.
Jess’ Monthly Reflection
This month, I've been reflecting on terminology. It began with last month's look at the historical evolution of mental health, deepened when I attended Restorative Journey's 2-day Restorative Practice Training, and was confirmed as this month's focus when I saw Daniel Muggleton's comedy show last week (more on why later).
The language we use to describe things is important. It can create communities of belonging or exclusion. So this month, I was curious to explore the evolution of three of our favourite corporate buzzwords - culture, innovation, and productivity. This is not a new concept for me, having previously shared an article on LinkedIn on ‘What playing Hashi can teach us about strategy’. Initially, I’d planned on making that a regular series called ‘Beyond the Buzz’, but then I started this newsletter, and I moved it lower down my list of ideas.
Restorative Practice Training reinforced the importance of creating shared language in the workplace. This is something I’ve actively been campaigning for since my days in a ‘traditional role’, as I saw time and time again how frustrating it was when we entered a conversation, generally about innovation, without understanding what we were actually talking about or committing to. Intentionally creating a shared language helps us understand what we mean when we say words like innovation in a specific context. This clarity should then help us build new paradigms for tomorrow.
With shared language in mind, I’m also in the process of rebranding Paradigm Makers. Even though mental health is at the core of everything we do, our mission is much broader. I’ve created a 50-year strategic plan, and I recently realised that if I want to make this vision a reality, this is the company I need to start building now. I have no idea if it’ll work, but I know that if it doesn’t, at least I’ll have a great story for this year’s Annual Failure Report!
Enjoy tomorrow’s Flower Moon,
Jess Price
Founder & Chief Vision Officer
EXPLORING THE HISTORICAL CONTEXT
Where did corporate buzzwords come from?
It turns out buzzwords have existed for as long as humans have worked (who knew?!). Yet the corporate buzzwords we know and love today all have their roots in the Industrial Revolution. This isn’t surprising, as the system of work we continue to use today was also designed for the work of the Industrial Revolution. Below are the lifecycles of three of our favourite corporate buzzwords - culture, innovation and productivity - from the Industrial Revolution to the Digital Age.
Culture
In the Industrial Era (Late 1800s-Early 1900s), the term culture was largely absent from corporate vocabulary. Instead, management discourse focused on efficiency, discipline, and standardisation. Workers were viewed as extensions of machines, and organisational life emphasised compliance, not community. Any reference to culture belonged to anthropology or nation-states, not workplaces. This aligned with the power dynamic at the time, where authority was formal, top-down, and focused on output. Culture was viewed as irrelevant to the scientific view of labour (which aligns with the distinction between work and labour).
By the Fordism Era (1920s-1960s) the idea of corporate culture began to emerge through concepts like morale, loyalty, and the ‘organisation man’. Companies encouraged long-term employment and internal cohesion, leading to informal cultures based on hierarchy, conformity, and stability. In the 1930s, groundbreaking concepts emerged around motivational influences, job satisfaction, resistance to change, group norms and worker participation. By the 1960s these concepts came to be known as the Hawthorne Effect, which was used to describe the “phenomenon in which subjects in behavio[u]ral studies change their performance in response to being observed.” While still hierarchical, management began using relational language to maintain control by restricting the freedom and responsibilities of those dependent on them (a leadership style known as paternalism).
During the Post-Fordism Era (1970s-1990s) people like Edgar Schein led the connection between culture and management. Managers began to believe that shaping a shared culture, through mission, vision, and values, could align employees’ beliefs with corporate goals. Culture became a tool of ‘normative control’, carefully creating a culture and recruiting people who assimilated into it. This shift from external control to ideological alignment saw organisations attempt to win hearts.
By the Digital Age (2000s-Present) culture became a central component of branding and talent strategy. Terms like ‘culture fit’, and ‘inclusive culture’ began to dominate HR lexicons. While some firms strived to build authentic cultures, others used it performatively, promoting values not reflected in practice. Culture became both a source of belonging and a filter for conformity. Shortcomings in organisational culture were also identified as the main barrier to company success. When evaluating organisational cultures, we can see how power is subtly exercised through narrative, symbolism, and recruitment.
Today, culture remains a critical lever for organisational health, but its meaning is often diluted or contested. Instead of ‘culture fit’, we are talking about ‘culture add’, and employees now demand a genuine culture aligned with psychological safety, inclusion, and purpose. This requires companies to reconcile rhetoric with reality, or risk eroding trust.
Innovation
The Industrial Era (Late 1800s-Early 1900s) was all about invention and improvement rather than innovation. Change was viewed as linear, with improvements to machinery or methods seen as engineering feats rather than innovations. Most companies preferred efficiency and stability over risk, and countries like France discouraged large investments in industrial innovations. Control and repetition were highly valued, with innovation an incidental byproduct of rapid industrial growth.
The term innovation became more formalised in the Fordism Era (1920s-1960s). However, it was often confined to R&D departments in large firms or companies like Bell Labs. Innovation was a technical term often associated with product design or industrial engineering, and became centralised, elite-driven, and invisible to the broader workforce.
Innovation as a core value in corporate strategy emerged in the Post-Fordism Era (1970s-1990s) due to books like In Search of Excellence, which shared findings from McKinsey research into what the best-run companies had in common. One of the key concepts outlined in the book was the ‘eight attributes of excellent, innovative companies’. This is also when terms like ‘continuous improvement’ and ‘core competency’ became standard in the corporate vernacular. Innovation became a mindset, rather than an output, with companies using innovation as a competitive advantage.
By the Digital Age (2000s-Present) every company was innovative. This was due in part to the rise of the internet and of open innovation. Buzzwords continued to grow, with terms like ‘disruption’, ‘agile’, and ‘fail fast’ becoming commonplace. Despite the Post-Fordism frameworks, innovation became a branding trick instead of a strategic advantage. Simultaneously, true innovation ecosystems emerged predominantly in technology fields with high rewards. Innovation reached a point where it was both liberating (encouraging experimentation) and exhausting (constant pressure to reinvent).
Today, innovation is expected everywhere, yet it’s rarely clearly defined. We’re told to be innovative without understanding what that means, or that true innovation doesn’t happen in isolation. I like Bill O’Connor’s definition of innovation from 2012: Innovation means ‘making connections to bring something new to the world.’ Real innovation requires taking risks, thinking creatively, and feeling comfortable in uncertainty.
Productivity
Interestingly, the term productivity was rarely used during the Industrial Era (Late 1800s-Early 1900s). The most influential management book of the 20th century, The Principles of Scientific Management by Frederick Winslow Taylor, introduced the world to the concept of ‘scientific management’. The purpose of the book was to frame efficiency as a moral imperative that could be applied to all kinds of human activity. Any mention of productivity was technical, describing how much a worker or machine could produce in a given time frame. This is also the time we started hearing about ‘best practice’, ‘output’ and ‘standardisation’. Productivity was considered a byproduct of control, calculated by managers, extracted from labourers and justified by science.
Productivity began to emerge as a formal economic indicator and tool for national progress in the Fordism Era (1920s-1960s). The Industrial Era’s emphasis on efficiency and standardisation continued, with tasks broken down to mechanised steps designed to maximise workers’ productivity. Productivity was used to justify wage structures, industrial policy, and management authority.
By the Post-Fordism Era (1970s-1990s) productivity shifted from factory output measure to an individual performance expectation. The rise of knowledge work reframed productivity as how much value one person could deliver, which was, and remains, often intangible. Concepts like performance reviews, self-management, and goal-setting frameworks emerged. Control migrated inward, with employees expected to internalise productivity goals as part of their identity.
The Digital Age (2000s-Present) saw productivity become a personal and cultural obsession we could track with apps, dashboards and AI analytics. Workers were encouraged to hack their time, manage energy, and maximise output without regard to how technology allowed them to complete more in less time. Productivity became self-disciplined, normalised through devices, quantified by metrics, and considered a personal virtue we should all aspire to have.
Today, productivity is everywhere, yet we are starting to ask questions about what it means to be productive and who benefits when we are. We are also starting to see shifts to redefine productivity in the age of knowledge work and AI that emphasise sustainable human contribution instead of the endless acceleration we’ve come to expect.
While corporate buzzwords like culture, innovation, and productivity may seem modern, their origins trace back to the Industrial Revolution. These terms have evolved alongside managerial ideologies, power structures, and social expectations, shifting from mechanistic roots to performative mandates in today’s workplaces.
Culture began as irrelevant to early factories, but grew into a strategic tool for alignment, belonging, and, at times, ideological control. Once informal and paternalistic, it is now central to employer branding and increasingly scrutinised for authenticity and impact.
Innovation moved from the margins of engineering to the centre of corporate identity. Once the domain of elite R&D, it’s now everywhere, both a rallying cry for transformation and, paradoxically, a diluted label often masking inertia.
Productivity evolved from a managerial calculation to a personal virtue. Rooted in efficiency and extractive systems, it has become internalised, often without questioning its human cost or who it truly serves.
These buzzwords endure not because they’re static, but because they continually adapt, mirroring the changing values and tensions of the systems we live and work in. To use them wisely today requires us to understand their histories, interrogate their meanings, and shape their futures with greater intention.
IDENTIFYING OPPORTUNITIES FOR WORK TODAY
Even when we think we have a shared understanding of what it means to talk about a topic, the only way we will truly know is if we explicitly ask. While Restorative Practice training got me thinking about this, it wasn’t until Daniel Muggleton’s show that I saw how powerful it can be in practice. During the show, Muggleton took several moments to take a knee and share a footnote. The purpose of the footnote was to clarify a joke that might have multiple interpretations or was at risk of misunderstanding. Even though we usually avoid clarifying what we mean, this small gesture created a collective understanding and led to even better jokes because we were in it together.
In the workplace, normalising these ‘footnotes’ is important, particularly when we refer to corporate buzzwords like culture, innovation, and productivity. As history demonstrates, how we interpret these words has evolved, and without explicitly sharing how you (as an individual, team or organisation) define these terms, you're likely to experience conflict.
Creating a shared language is easier than it sounds, even if people think you’re a little weird at first. Here's how Paradigm Makers are normalising explicit language for our work:
Step 1: Paradigm Makers has a NotebookLM notebook populated with our vision of the future, values, strategic priorities and 5 Essential Elements (our “foundational documents”).
Step 2: When a new term arises, NotebookLM is asked to create a definition aligned with Paradigm Makers’ foundational documents.
Step 3: This definition is refined to ensure it complements the traditional definition, whilst also aligning with the essence of Paradigm Makers.
Step 4: The refined definition is added to our internal Glossary (which will be available on the new website).
Step 5: Whenever the term is referenced, a link to the definition is included to create a shared understanding of how Paradigm Makers define the term (this is something you’ll start seeing once the new website is live).
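For readers who like to see process as code, Steps 4 and 5 above can be sketched as a tiny glossary that stores refined definitions and generates a reference link whenever a term is used. This is an illustrative sketch only; the class name, base URL, and link format are hypothetical, not Paradigm Makers’ actual tooling.

```python
# Hypothetical sketch of the glossary workflow (Steps 4-5).
# All names and the base URL below are illustrative assumptions.

class Glossary:
    def __init__(self, base_url):
        self.base_url = base_url
        self.terms = {}  # term (lowercased) -> refined definition

    def add(self, term, definition):
        """Step 4: add a refined definition to the internal glossary."""
        self.terms[term.lower()] = definition

    def link(self, term):
        """Step 5: return a link to the definition whenever the term is referenced."""
        key = term.lower()
        if key not in self.terms:
            raise KeyError(f"No shared definition yet for '{term}'")
        slug = key.replace(" ", "-")  # simple anchor-friendly slug
        return f"{self.base_url}/glossary#{slug}"

glossary = Glossary("https://example.com")
glossary.add("Innovation", "Making connections to bring something new to the world.")
print(glossary.link("Innovation"))
```

The point of the sketch is the discipline it encodes: a term can only be linked once it has an agreed definition, so undefined buzzwords fail loudly instead of silently meaning different things to different people.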
This is also something we do with our clients. During project initiation, we create custom definitions of our essential elements that embed their company mission, values, etc., into everything we create together. Doing this creates a shared understanding between us, which helps as the project progresses. It means we are thinking the same thing when we say words like culture, innovation or productivity. It also ensures we are actively demonstrating and living things like mission and values, instead of having them sit in a strategic plan or on a poster in the office gathering dust.
CREATING A NEW WORK PARADIGM FOR TOMORROW
Disclaimer: I’d like to acknowledge that there are several words in this section for which I should have shared Paradigm Makers’ definitions. I chose not to, as I have not yet identified the most appropriate way to share our definitions and I didn’t want to rush this process. For now, Google any terms and refer to the dictionary definitions.
In future, instead of creating new terms to refer to old concepts, how about creating shared definitions instead? As individuals, organisations and societies, there are terms we all reference. Instead of assuming we’re talking about the same thing, let’s normalise defining these terms. Even if our personal definitions vary, normalising explicit definitions will help us avoid misunderstandings and lead to the inclusive and sustainable future we all talk about. Using our 5 Elements, here's how we could do this:
People move from passive absorbers of imposed terminology to active contributors of meaning.
Language is often handed down from sources of authority, which erases individual and cultural nuance. Instead, let’s build systems where language emerges from conversation, shaped by the people it intends to serve. We can:
leverage the relevant community to share divergent definitions and cultural meanings;
encourage personal and professional development by integrating organisational glossaries into onboarding, feedback and reflective practices; and
allow expertise to shape domain-specific vocabularies, while connecting back to common ground through our ‘foundational documents’.
Under this approach, shared language becomes shared power when people are seen not just as users of words, but as authors of meaning.
Innovation shifts from generating new terms to sensemaking across boundaries.
Innovation often invents new buzzwords that fracture understanding across roles, functions, or generations. This causes us to confuse novelty with clarity. Instead, we should:
practice curiosity by asking “what does this mean to you?” rather than assuming shared understanding;
encourage experimentation among teams to test new terms in conversation, observe the impact, and refine based on feedback; and
prioritise evolving existing terms toward greater accessibility, rather than discarding them for trends.
In this paradigm, shared language is discovered, not invented, through iterative, relational sensemaking.
Technology becomes a bridge for shared language, not just a channel.
Digital tools amplify and replicate language norms. If left unexamined, they can become extractive, rather than inclusive. Instead, we must design:
opportunities to harmonise language across platforms by integrating glossaries into communication tools and updating AI auto-suggestions to reflect group norms;
connectivity strategies that ensure all voices contribute to language formation by allowing multilingual onboarding and asynchronous input on terms; and
analytical tools to map where language breaks down so we can track confusion, misalignment, or drop-off in engagement around key concepts.
Technology could then become a tool to enhance coherence and participation through shared language, rather than just a tool to increase speed.
Economics reframes shared language as infrastructure, not overhead.
Language work is often seen as soft, delegated to comms or DEI functions. But incoherent language erodes trust, productivity and innovation. Let’s reevaluate the role of shared language by:
tying revenue to improved communication outcomes, like reduced friction, better cross-team collaboration or faster onboarding;
mapping expenditure of time and energy on miscommunication to show the true cost of undefined terms; and
framing sustainability as linguistic clarity to ensure today’s language choices won’t become tomorrow’s barriers.
Shared language, when done correctly, is a keystone asset in any complex system.
Norms move from implicit codes of compliance to explicit, evolving agreements rooted in candour, coherence, and co-creation.
Traditional systems rely on unspoken norms that reward conformity while punishing ambiguity or dissent. This reinforces us vs them dynamics and inhibits transformation. Instead, let’s redesign shared language as a living protocol shaped by its participants. To do this, we can:
make candour a foundation where people are safe to challenge dominant language, offer alternatives, and admit when terms no longer serve;
use structure to create visible containers, like glossaries, that allow terms to evolve collectively; and
embed value alignment into communication audits, ensuring language reflects shared principles rather than inherited jargon.
We normalise shared language by making it context-aware, participatory, and revisable.
Thanks for making it to the end! Let me know if you’re ready to start incorporating any of these actions into your day to day life.