Brownfield Industrial AI: why data integration is the real bottleneck

Industrial AI rarely starts with perfect data. Success depends on pragmatic OT/IT integration.

Most production sites are not greenfield.

Brownfield means PLC systems, historians, CSV exports, proprietary protocols, manual shift books, old gateways, new APIs and machines that have been productive for years. Industrial AI must create value in exactly that environment.

It is tempting to design the perfect target model first. In practice, that is often too slow. Successful projects start with a concrete operating problem and work back to the data sources that actually explain it.

The same is true for charging infrastructure. Existing CPMS platforms, OCPP 1.6 and 2.0.1, roaming systems, meters, fleet portals, installer processes and vendor logic all coexist. Integration is the reality check.

The edge is a translation layer between OT and IT.

An Industrial Edge Device can normalize, buffer, validate and enrich data locally. Many sources become one event model that can be used for monitoring, AI, APIs and cloud synchronization.
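One way to picture such a unified event model is a small, explicit record type. This is a minimal sketch; the field names, units and quality markers are illustrative assumptions, not the schema of any specific edge product:

```python
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass(frozen=True)
class EdgeEvent:
    """One normalized event, regardless of the original protocol."""
    asset_id: str        # which machine or charge point produced the value
    signal: str          # harmonized signal name, e.g. "motor_temperature"
    value: float         # numeric value in the canonical unit
    unit: str            # canonical unit, e.g. "degC"
    timestamp: datetime  # always timezone-aware UTC
    quality: str         # e.g. "good", "uncertain", "bad"
    source: str          # original source, e.g. "opcua", "modbus", "csv"


def from_modbus(asset_id: str, register_value: int) -> EdgeEvent:
    """Hypothetical mapping: a raw Modbus register holding tenths of a
    degree Celsius becomes one normalized event. The scaling factor and
    signal name are assumptions for illustration."""
    return EdgeEvent(
        asset_id=asset_id,
        signal="motor_temperature",
        value=register_value / 10.0,
        unit="degC",
        timestamp=datetime.now(timezone.utc),
        quality="good",
        source="modbus",
    )
```

Whether the value arrived via OPC UA, Modbus or a CSV export, downstream consumers only ever see `EdgeEvent`.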

This translation is not a side task. It decides whether AI can later make reliable statements. Wrong units, shifted timestamps or missing context lead to models that work technically but disappoint operationally.

  • Combine OPC UA, MQTT, Modbus, files, databases and APIs.
  • Harmonize timestamps, units, asset context and quality markers.
  • Buffer data locally so network issues do not cause data loss.
  • Transfer only relevant events and aggregated signals centrally.
  • Version and monitor interfaces instead of leaving one-off scripts behind.
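The buffering and "only relevant events" points above can be sketched as a minimal store-and-forward loop. The batch size and aggregate fields are illustrative, and a real edge device would persist the buffer to disk rather than memory:

```python
from collections import deque


class StoreAndForward:
    """Buffer readings locally; forward only aggregates when the link is up."""

    def __init__(self, send, batch_size: int = 10):
        self.send = send              # callable that ships data centrally
        self.batch_size = batch_size
        self.buffer: deque = deque()  # in-memory here; persist to disk in practice

    def ingest(self, value: float, link_up: bool) -> None:
        self.buffer.append(value)
        if link_up and len(self.buffer) >= self.batch_size:
            batch = [self.buffer.popleft() for _ in range(self.batch_size)]
            # transfer an aggregated signal centrally, not every raw sample
            self.send({"count": len(batch),
                       "mean": sum(batch) / len(batch),
                       "max": max(batch)})
```

During an outage, readings accumulate locally; once the link returns, nothing has been lost and only the aggregate travels to the central system.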

Data integration is product work.

Many AI pilots fail when moving into operations because data integration was treated as a project task. A script reads data, a dashboard looks good, but nobody notices when a source fails, a tag is renamed or a unit is interpreted differently.

Production integration needs the same qualities as good software: monitoring, error handling, tests, versioning, clear ownership and traceable data contracts. This is familiar terrain for NeLeSo because OCPP and CPMS integrations require exactly this discipline.
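A traceable data contract can be as simple as an explicit, versioned expectation that fails loudly instead of guessing silently. The tag names and units below are hypothetical:

```python
# Version 1 of a data contract: expected tag name -> canonical unit.
CONTRACT_V1 = {
    "motor_temperature": "degC",
    "line_speed": "m/s",
}


def check_contract(reading: dict) -> list:
    """Return human-readable violations instead of silently coercing data."""
    problems = []
    tag = reading.get("tag")
    if tag not in CONTRACT_V1:
        problems.append(f"unknown tag {tag!r} (renamed at the source?)")
    elif reading.get("unit") != CONTRACT_V1[tag]:
        problems.append(
            f"{tag}: got unit {reading.get('unit')!r}, "
            f"contract expects {CONTRACT_V1[tag]!r}"
        )
    return problems
```

A renamed tag or a reinterpreted unit then surfaces as a monitored alert, not as a quietly wrong dashboard.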

Pipelet principles also help in Industrial AI.

Pipelet grew out of charging infrastructure, but several principles are more general: building blocks instead of monolithic platforms, clear connectors, local state, event journal, simulators, diagnostics and clean synchronization.

In Industrial AI projects, we apply this thinking to OT/IT integration. One edge building block can connect a machine, another can normalize data and another can provide AI inference or copilot functions. A pilot can then grow step by step into a robust operating architecture.
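The building-block idea can be sketched as small, independently replaceable stages composed into one pipeline. This does not mirror Pipelet's actual API; the stages, units and threshold are assumptions for illustration:

```python
def connect(raw: dict) -> dict:
    """Building block 1: read from a source (here a simple passthrough)."""
    return raw


def normalize(event: dict) -> dict:
    """Building block 2: convert to canonical units (assumed: degF -> degC)."""
    if event.get("unit") == "degF":
        event = {**event,
                 "value": (event["value"] - 32) * 5 / 9,
                 "unit": "degC"}
    return event


def infer(event: dict) -> dict:
    """Building block 3: attach a simple inference result (threshold assumed)."""
    return {**event, "alert": event["value"] > 80.0}


def pipeline(raw: dict) -> dict:
    # Each stage can be swapped without touching the others: another
    # connector, a better normalizer, an ML model instead of a threshold.
    return infer(normalize(connect(raw)))
```

A pilot might start with exactly these three blocks and later replace `infer` with a trained model, leaving connector and normalizer untouched.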

The right starting point is a data path, not a model catalog.

The question should not be: which model do we want to use? The better question is: which data path must be stable so that an operator can make a better decision?

Once this path exists, AI models become more interchangeable. A simple rule set can already create value, an ML model can extend it later and a copilot can explain technical events. Without a stable data path, every model remains fragile.
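The interchangeability of models behind a stable data path can be sketched as a shared interface: a plain rule set today, an ML model or copilot later, same call site. The signal name and threshold are hypothetical:

```python
from typing import Callable

# Any "model" is a function from a normalized event to a recommendation.
Model = Callable[[dict], str]


def rule_based(event: dict) -> str:
    """Day one: a simple rule set already creates value (threshold assumed)."""
    return "inspect bearing" if event["vibration_rms"] > 4.5 else "ok"


def decide(event: dict, model: Model) -> str:
    """The call site never changes when the model behind it is swapped."""
    return model(event)
```

Because `decide` only depends on the normalized event, replacing `rule_based` with an ML model later is a local change; without the stable data path underneath, every such swap would be a rewrite.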