4 min read

What 8 Years at Dyson Taught Me About Enterprise Platforms

Tags: Dyson, enterprise, career lessons, digital platforms, CRM

A Long Run at One Company

Eight years at a single company is unusual in tech, and people sometimes ask why I stayed. The honest answer is that the role kept evolving. What started as CRM development grew into marketing automation, then digital platform architecture, and eventually AI engineering. I essentially had four different careers within one company.

Dyson is a genuinely interesting place to work for a technologist. The combination of hardware innovation and digital ambition creates challenges you do not find at a typical software company. Here is what I learned.

Lesson 1: Scale Changes Everything

There is a fundamental difference between building something that works for 100 users and building something that works for millions. At Dyson, the customer base spans dozens of countries and multiple languages. Every system I built had to handle that scale from day one.

This taught me to think about edge cases obsessively. What happens when someone's name contains characters your system does not expect? What happens when a daylight-saving transition causes a scheduled job to run twice? What happens when a database query that is fine at 10,000 rows takes 30 seconds at 10 million?
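The first question is a good example, because it bites even in plain Python. Here is a minimal sketch, with an invented `normalize_name` helper: the same name can arrive from two upstream systems with different byte-level spellings, and a naive comparison treats them as different customers.

```python
import unicodedata

def normalize_name(name: str) -> str:
    """Defensively normalize a customer name (illustrative helper).

    NFC normalization makes composed and decomposed accents compare
    equal; strip() guards against stray whitespace from upstream feeds.
    """
    return unicodedata.normalize("NFC", name).strip()

# Two byte-level representations of the same name:
composed = "Ren\u00e9"        # "é" as a single code point
decomposed = "Rene\u0301"     # "e" followed by a combining accent

assert composed != decomposed                  # naive comparison fails
assert normalize_name(composed) == normalize_name(decomposed)
```

The same defensive habit applies to the other questions: make the scheduled job idempotent so a double run is harmless, and test queries against production-sized data, not the dev fixture.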

These lessons directly inform how I build AI systems today. I always ask: what happens when this runs at 100x the current volume?

Lesson 2: Data Quality Is the Real Bottleneck

I cannot count the number of projects I have seen delayed or derailed by data quality issues. In a large enterprise, data flows through dozens of systems, each with its own schema and validation rules. By the time data reaches your application, it has been transformed, merged, and potentially corrupted multiple times.

In AI engineering, this lesson is critical. The quality of your AI output is directly limited by the quality of the data you feed it. I spend more time on data validation and cleaning than on any other part of my pipelines.

The 80/20 of Data Quality

  • 80% of data quality issues come from missing values and inconsistent formatting
  • The remaining 20% are subtle and extremely hard to catch: duplicates with slight variations, outdated records that look current, and valid-looking data that is simply wrong
  • Automated validation catches the first 80% easily. The other 20% requires domain knowledge and often human review
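To make the first bullet concrete, here is a sketch of the kind of automated check that catches the easy 80%. The field names and rules are invented for illustration, not any real schema; the point is that missing values and formatting drift are cheap to detect mechanically.

```python
REQUIRED = ("email", "country", "signup_date")

def validate_record(record: dict) -> list[str]:
    """Return a list of issue codes for one CRM-style record (hypothetical schema)."""
    issues = []

    # The easy 80%: missing or blank required fields
    for field in REQUIRED:
        value = record.get(field)
        if value is None or str(value).strip() == "":
            issues.append(f"missing:{field}")

    # ...and inconsistent formatting
    email = str(record.get("email", ""))
    if email and "@" not in email:
        issues.append("format:email")
    country = str(record.get("country", ""))
    if country and country != country.upper():
        issues.append("format:country_not_uppercase")

    return issues

# A clean record passes; a record missing its email is flagged
assert validate_record({"email": "a@b.com", "country": "GB",
                        "signup_date": "2024-01-01"}) == []
assert "missing:email" in validate_record({"country": "GB",
                                           "signup_date": "2024-01-01"})
```

The hard 20% (near-duplicates, stale-but-plausible records) cannot be expressed as per-record rules like these, which is exactly why it needs domain knowledge and human review.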

Lesson 3: Enterprise Software Moves Slowly for Good Reasons

Early in my career, I was frustrated by how slowly things moved in enterprise software. Change management, security reviews, stakeholder alignment, testing cycles: it all felt unnecessarily cautious. Over time, I came to understand that this caution exists because the consequences of failure at enterprise scale are enormous.

When an email goes out to 5 million customers with the wrong content or a broken link, that is not a quick fix. When a CRM integration corrupts customer data, recovery can take weeks. The processes that felt slow were actually protecting against catastrophic failures.

I carry this mindset into my AI work. Move fast, but have guardrails. Automate aggressively, but validate ruthlessly.

Lesson 4: Cross-Functional Communication Is a Superpower

At Dyson, I worked with marketers, designers, data analysts, product managers, and executives. Each group speaks a different language and has different priorities. Learning to translate technical concepts into terms each audience understands has been one of the most valuable skills I have developed.

In AI engineering, this skill is essential. When I explain an AI system to a stakeholder, I do not talk about transformer architectures or token counts. I talk about what the system does, how reliable it is, and what it costs. Technical depth is important, but communication determines whether your work has impact.

Lesson 5: Build for Maintainability

When you stay at a company long enough, you end up maintaining systems you built years ago. This is a humbling experience. Code that seemed clever when you wrote it becomes a nightmare to debug at 2am on a Sunday.

The systems that aged best were the simple ones. Clear naming, straightforward logic, good documentation, and comprehensive logging. The systems that caused the most pain were the ones where I had been too clever or too rushed.

  • Write code that your future self (or your replacement) can understand at a glance
  • Log everything, but log it usefully with context
  • Keep dependencies minimal and well-documented
  • Build monitoring in from the start, not as an afterthought

Lesson 6: The Best Platform Is the One People Actually Use

I have seen beautifully architected systems fail because nobody wanted to use them, and scrappy solutions succeed because they solved real problems for real people. Adoption trumps architecture every time.

The most technically impressive system I ever built was also the least used. The simplest tool I ever built is still running in production today, years later. That taught me more about engineering than any textbook.

Taking Enterprise Lessons Into AI

These eight years gave me a foundation that I draw on daily as an AI engineer. Understanding scale, data quality, risk management, communication, and maintainability are not glamorous skills, but they are the skills that separate AI demos from AI products. I would not trade that experience for anything.