versiondog on the Trail of the Big Bang
The backup and monitoring of industrial control system programs for the Large Hadron Collider (LHC), the world's largest particle accelerator, at CERN near Geneva has been entrusted to the data management system versiondog. The European Organization for Nuclear Research uses the system to manage the data of around 500 components, making versiondog one of CERN's standard technologies.
AUVESY GmbH is a well-established medium-sized enterprise based in Rhineland-Palatinate, Germany. Its software versiondog provides a secure solution for version control and data management in industrial automation worldwide. This leading version control software is the basis for raising production efficiency by reducing errors and downtime, and for monitoring and controlling automation projects.
Equipment & Machinery
CERN - looking deeply into matter
CERN was founded in 1954 as a research organisation for fundamental physics. It is located in Meyrin near Geneva. A remarkable international collaboration, the European Organization for Nuclear Research is now run by 22 member states. With an annual budget of over one billion euros, the Organization hosts around 11,000 visiting scientists from all over the world, working on various projects. The main focus of their research is the exploration of the fundamental particles that make up the Universe. Powerful accelerators are used to bring particles to near light speed. The biggest is the LHC (Large Hadron Collider).
To reach the LHC, it is first necessary to descend 100 m underground. You would then need a bicycle to follow the 27 km circumference of the collider, passing several thousand electromagnets, some as big as a freight container, masses of cables, gigantic detectors and many computers. All of this serves research into the smallest particles in nature, helping to answer questions about the origin of the universe, such as why there is far more matter than antimatter. Inside the ring, elementary particles are accelerated by the electromagnets to speeds close to that of light, i.e. in the region of 300,000 km per second. They are then smashed together at predetermined collision zones. The resulting shower of particles leaves trails that can be traced and analysed by huge detectors. The Worldwide LHC Computing Grid (WLCG) was developed to handle the enormous quantity of data produced. Dispersed across the globe, this computing and data storage network can deal with data volumes in the order of 30 petabytes.
"We have made significant gains in certainty and quality when it comes to data availability in areas where many programmable logic controllers are in use. Implementing versiondog has put us on a new quality level," says Jerónimo Ortolá Vidal, Automation engineer at the Industrial Controls and Safety systems Group of the Beams Department at CERN.
The LHC was commissioned in 2008 to carry out cutting-edge research into particle physics. To safeguard its control system project data, CERN uses versiondog from AUVESY, the leading manufacturer-independent software solution for backup, version control and documentation of project data for industrial control systems. It uses a standardised workflow and centralised data storage, makes automatic backups and ensures easily comprehensible documentation of each step in the development process. The SmartCompare function enables detailed program comparison with the same familiar presentation as the system editor. Support is provided for audit trail documentation in accordance with ISO 900x, VDA 6.x, FDA 21 CFR 11, GAMP and GMP.
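The backup-and-versioning workflow described above can be sketched in outline: a centralised repository stores a new version only when the project data has actually changed. This is a minimal illustration of the general technique, not AUVESY's implementation; all names here are hypothetical.

```python
import hashlib
from dataclasses import dataclass, field


@dataclass
class ProjectArchive:
    """Toy model of a centralised repository: each backup is compared
    against the latest stored version, and a new version is created
    only when the content has changed."""
    versions: list = field(default_factory=list)  # (version_no, digest, data)

    def backup(self, data: bytes) -> int:
        digest = hashlib.sha256(data).hexdigest()
        if self.versions and self.versions[-1][1] == digest:
            # Unchanged since the last backup: keep the current version.
            return self.versions[-1][0]
        version_no = len(self.versions) + 1
        self.versions.append((version_no, digest, data))
        return version_no


archive = ProjectArchive()
v1 = archive.backup(b"PLC program A")  # first backup creates version 1
v2 = archive.backup(b"PLC program A")  # no change, still version 1
v3 = archive.backup(b"PLC program B")  # change detected, version 2
```

Keying on a content hash rather than a timestamp keeps the version history free of duplicate entries from unchanged scheduled backups.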
versiondog has been in use at CERN for a year. There is now a centralised repository where the project data of the control systems from Siemens (SIMATIC S7, TIA Portal, WinCC flexible) and Schneider Electric (Unity Pro) is safeguarded and managed. "We want to make all the processes of the control system and their surrounding ancillary equipment homogeneous," says Ortolá. "Our goal is to always have a clear overview of all PLCs and HMIs and all the changes that are made to their control programs. We want to be able to manage all program versions using a standardised procedure, and it is extremely important to us to be able to store and safeguard data centrally. versiondog helps us do that."
- SOLUTION MATURITY
Mature (technology has been on the market for > 5 years)
- OPERATIONAL IMPACT
Impact #1 [Management Effectiveness - Operation Transparency]
Management of the various individual processes and control systems is the responsibility of a number of different departments. Within this structure, Ortolá's department provides a CERN-wide support service. The versiondog system was introduced with the goal of standardising processes across these departments. Staff now have a much clearer picture of processes and their current status. All changes are visible and comprehensible to everyone, and they can be undone if necessary. Furthermore, a backup of all data is performed once a week.
Impact #2 [Efficiency Improvement - R&D]
"Errors made while modifying programs have been reduced to an absolute minimum since we started using versiondog," explains Ortolá. "Centralised data storage guarantees that we are always working with the latest approved and released program versions." If for some reason a system goes down, the latest version is immediately available. What's more, the system checks that the version running on a control system (the online version) really does correspond to the latest version saved on the server (the offline version). Before versiondog, changes could go unnoticed. Regular online-offline comparisons ensure that this can no longer happen. If a discrepancy is detected, the system informs the appropriate administrator by email. All this means that versiondog has led to an improvement in quality and an increase in the level of work process standardisation at CERN.
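The online-offline check described above can be illustrated in a few lines: reduce both the running program and the server copy to a fingerprint, compare them, and trigger a notification on mismatch. This is a generic sketch of the comparison idea, not versiondog's actual mechanism; the function and variable names are invented for illustration.

```python
import hashlib


def fingerprint(program: bytes) -> str:
    """Reduce a control program image to a comparable fingerprint."""
    return hashlib.sha256(program).hexdigest()


def compare_online_offline(online: bytes, offline: bytes, notify) -> bool:
    """Return True if the running (online) program matches the last
    saved (offline) version; otherwise invoke the notification hook,
    which in a real system might send an email to the administrator."""
    match = fingerprint(online) == fingerprint(offline)
    if not match:
        notify("Online/offline mismatch detected - please review.")
    return match


alerts = []
ok = compare_online_offline(b"program v42", b"program v42", alerts.append)
bad = compare_online_offline(b"program v42 edited", b"program v42", alerts.append)
```

Run on a schedule against every PLC, a check like this turns silent, undocumented changes into explicit alerts.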
- QUANTITATIVE BENEFIT
Benefit #1
Research and development in a facility such as CERN never comes to an end. Ortolá is continually adapting the versiondog system to the changing needs of the Organization. When he needs support, he gets it directly from the data management specialists at AUVESY, who help with individual elements of configuration and with broader adaptations of the system to new conditions.
Benefit #2
Ortolá sums up: "versiondog makes it possible for us at CERN to safeguard our control system data and store it centrally, which is a crucial element for the Organization."
- USE CASES
Process Control & Optimization
Process control and optimization (PCO) is the discipline of adjusting a process to maintain or optimize a specified set of parameters without violating process constraints. The PCO market is being driven by rising demand for energy-efficient production processes, safety and security concerns, and the development of IoT systems that can reliably predict process deviations. Fundamentally, there are three areas that can be adjusted to achieve optimal performance.
- Equipment optimization: The first step is to verify that the existing equipment is being used to its fullest advantage by examining operating data to identify equipment bottlenecks.
- Operating procedures: Operating procedures may vary widely from person to person or from shift to shift. Automation of the plant can help significantly, but automation is of no help if the operators take over and run the plant in manual.
- Control optimization: In a typical processing plant, such as a chemical plant or oil refinery, there are hundreds or even thousands of control loops. Each control loop is responsible for controlling one part of the process, such as maintaining a temperature, level or flow. If a control loop is not properly designed and tuned, the process runs below its optimum: it becomes more expensive to operate, and equipment wears out prematurely. For each control loop to run optimally, it is important to identify sensor, valve and tuning problems. It has been well documented that over 35% of control loops typically have problems. The process of continuously monitoring and optimizing the entire plant is sometimes called performance supervision.
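A control loop of the kind described above, such as one holding a temperature at a setpoint, is classically implemented as a PID controller. The following is a textbook sketch against a deliberately simplistic first-order plant model; the gains and the plant response are illustrative assumptions, not values from any real installation.

```python
class PIDController:
    """Textbook PID: output = Kp*e + Ki*integral(e) + Kd*de/dt."""

    def __init__(self, kp: float, ki: float, kd: float, setpoint: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = None

    def update(self, measurement: float, dt: float) -> float:
        error = self.setpoint - measurement
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# Drive a toy first-order process (think: a heater) toward a 100-degree
# setpoint. The plant model "temperature += 0.1 * power" is a stand-in
# for real process dynamics.
pid = PIDController(kp=0.5, ki=0.1, kd=0.05, setpoint=100.0)
temperature = 20.0
for _ in range(500):
    power = pid.update(temperature, dt=1.0)
    temperature += 0.1 * power
```

Tuning, as the text notes, is where loops typically go wrong: gains that are too aggressive cause oscillation and premature equipment wear, while gains that are too timid leave the process running below its optimum.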