A digital ecosystem can only thrive if there are clear and fair rules for sharing data

Sunday, April 28, 2019

Neon sign below a window overlooking a skyline.
Christian Neumann, CEO of tapio

The data world needs new rules, or who owns which digital twin?

Where are we today?

The rules of today's economy date back to a time when primarily physical, and therefore tangible, assets set the tone. In a world in which intangible assets are becoming increasingly important, we must ask which rules should apply. Data is undoubtedly the intangible asset most in need of clarification. Although the analogy "data is the new gold" is not entirely accurate, it is true that data plays an important role, and that role will only grow. Data is already driving many important fields of innovation, and in combination with AI it will become even more important. The WSJ put it well in an editorial: in the future, the strategic decision makers will be algorithms, not the raw data. Already today we are "drowning" in data worldwide while "starving for knowledge."

The challenge is that the ownership of data cannot really be regulated clearly, and legislators have not yet clarified the field. Although the DLT (Distributed Ledger Technology) world promises to solve this problem, that is not yet a reality, and even if it were, the point could never be addressed purely technically. It remains the case that data can be copied as often as you like without any loss of quality: if someone sends me something digitally, I can usually forward it without any problem.

Besides, over the last few years, or rather the last two decades, a platform capitalism has spread that monetizes user data in a very opaque way. For a long time, this seemed an acceptable deal to users: "I give you my data, and you seem to give me your service for free." One has to ask whether the majority of users really understood the deal they were entering into. But this system has since run amok, and we have all learned in our private lives what happens when you make yourself totally transparent without control. The scandals of recent years, be it stolen passwords or data misuse of all kinds, have helped to sharpen awareness. The B2B world, in particular, has always been somewhat skeptical on this point, because data is much more valuable there than in the private sector.

Applying these principles from the B2C world to the B2B environment is, in our eyes, too short-sighted. We assume that the clarification of data ownership is a cornerstone of the success of Industry 4.0. Given all the scandals from the B2C world, what incentive does an industrial company have to hand the data of its machines over to, for example, a machine builder? The machine builder could certainly make good use of it: on the one hand, he could feed his product development with it and hopefully bring better products to market; on the other hand, he could offer data-driven services. But why should an industrial company give the machine builder data for nothing?

So what is needed is clear regulation of data ownership. Unfortunately, this is not yet understood today by legislators, the associations (Bitkom, VDMA, Platform I4.0), or the large IoT providers, or it is deliberately ignored. A recent study by Accenture showed, however, that exchange under defined rules is much more successful than isolation: 77% of the "masters" in the Accenture study state that they exchange data under clearly defined conditions. Until recently, protection and isolation were the core strategy for many manufacturing companies around the world, so it is certainly a big step to now have to think about "sharing." The history of Adler, however, describes very well what the benefits are and illustrates the cultural change needed to start sharing.

What could a modern world look like?

Let's first separate the technical realization and the potential set of rules for further consideration.

In my eyes, an economy can only flourish if its participants have control over their "goods" and actions; this is basically why, historically, states have put so much effort into clarifying ownership. It is precisely this kind of control that is currently not assured for data: legally, there is no clear definition of who owns data. That is why we should start here. You can find a short video about this on YouTube.

The first approach is that data belongs to whoever owns the associated physical object. This sounds good at first, but in some instances it leads to challenges, particularly when goods are transferred under retention of title (e.g., a financed purchase or leasing). To find an answer for such cases, one has to take a closer look at the data structure. In our view, there are three categories of data: master data, instance data, and process data.

Master data describes basic properties and can be equated with the information in a product catalog. This information is valid for all products of a type and is not limited to a single instance.

Instances are uniquely identifiable assets with a unique identification number. Instance data is a concretization of master data and is essential when a unique object belongs to a product type but, due to subtleties in the production process, vital parameters are always somewhat individual. Example: in tools, the geometry of each unique object varies. The master data may indicate that a drill has a diameter of 1.5 - 1.6 cm; in the instance data, for the concrete object, it is then recorded that the diameter is 1.543 cm. This accuracy is vital in the production process so that, e.g., CNC machines can calculate the compensations correctly and the workpiece is subsequently milled precisely.

Process data stores concrete information from the production process: the data generated directly by an object, i.e. measured values, events, and results from the "life" of a machine. It can be understood as the tachograph or telemetry data of the machine. Parts of it can flow, aggregated, into the instance data, for example the cumulative operating hours.
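The three categories above can be illustrated as a small data model. This is only a minimal sketch of the article's distinction, using the drill example from the text; all class, field, and serial-number names are illustrative, not part of any standard.

```python
from dataclasses import dataclass, field

@dataclass
class MasterData:
    """Catalog-level properties, valid for every product of this type."""
    product_type: str
    diameter_range_cm: tuple  # catalog tolerance, e.g. (1.5, 1.6)

@dataclass
class InstanceData:
    """Concretization of the master data for one uniquely identifiable asset."""
    serial_number: str
    master: MasterData
    measured_diameter_cm: float  # measured for this concrete object

@dataclass
class ProcessData:
    """Telemetry generated by the object during its 'life'."""
    serial_number: str
    events: list = field(default_factory=list)  # measured values, events, results

# The drill example from the text:
catalog = MasterData("drill", (1.5, 1.6))
drill = InstanceData("SN-001", catalog, 1.543)
telemetry = ProcessData("SN-001", [{"operating_hours": 120}])

# The instance value lies within the catalog tolerance:
lo, hi = catalog.diameter_range_cm
assert lo <= drill.measured_diameter_cm <= hi
```

The key point of the model is the direction of the references: an instance points to its master data, while process data only references the instance via the identification number.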

If you now look at the topic on these three levels, our approach is as follows. The master data should always be the property of the manufacturer, who has sovereignty over product development and should therefore also have sovereignty over the digital counterpart. Instance data is the link between master data and process data. When someone buys a physical object, he should also become the owner of the digital instance: on purchase, ownership of both the physical good and the digital twin passes to the buyer. If the purchase is subject to retention of title or a leasing construct, the same applies to the digital twin. The buyer then becomes the possessor, but not the owner, as long as the retention conditions are not fulfilled; in the case of leasing, the lessor remains the owner, and the buyer always remains "only" the possessor. This works well because the instance is a true digital copy of the physical object but does not contain any further data of the buyer. Only the process data, i.e. the data generated by the object in the buyer's production process, includes data of the buyer. This data should therefore always be the property of the buyer or possessor, because he has produced it, regardless of whether he is the possessor or owner of the physical object.

This leads to the following approach:

  • Master data: Always owned by the manufacturer of the physical object

  • Instance data: Follows ownership of the physical good. There can thus be a split situation in which one party is the owner and another the possessor; here it has to be clarified which rights the possessor has.

  • Process Data: Always owned by the possessor of the physical object
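The three rules above can be condensed into a simple decision function. This is a sketch of the proposal only; the function name and the role names ("MachineBuilder", "Lessor", "Lessee") are illustrative, not a defined standard.

```python
def data_owner(category: str, manufacturer: str, owner: str, possessor: str) -> str:
    """Return which party owns data of the given category under the proposed rules."""
    if category == "master":
        return manufacturer   # always the manufacturer of the physical object
    if category == "instance":
        return owner          # follows ownership of the physical good
    if category == "process":
        return possessor      # always the party producing with the object
    raise ValueError(f"unknown data category: {category}")

# Leasing example: the lessor stays owner, the lessee is the possessor.
print(data_owner("master", "MachineBuilder", "Lessor", "Lessee"))   # MachineBuilder
print(data_owner("instance", "MachineBuilder", "Lessor", "Lessee")) # Lessor
print(data_owner("process", "MachineBuilder", "Lessor", "Lessee"))  # Lessee
```

In a plain purchase, owner and possessor are the same party, and the split between instance and process data disappears.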

The following cases are still open:

  • a) What happens if the object is sold: does the new owner also get access to the process data of the previous owner?

  • b) Who owns aggregated (instance) data that results from process data, for example the total runtime of an object?

  • c) What rights does the possessor have to the instance data if he is not the owner?

Regarding a): we suggest that the process data should not be transferred, as it may contain production information that may be critical. Moreover, the buyer buys the object and its functionality, not the production process information of the previous owner. Only the instance data should therefore be relevant to him, because it contains all the critical information about the object and refers to the superordinate master data. The instance data provides an overview of the status of the object and describes the essential parameters influenced by its usage history: everything a buyer needs to know to make an informed purchase decision and to make the best possible use of the object.

Regarding b): the owner should be able to determine this, and it should be clarified transparently during the sale. So if the lessor (owner) demands from the buyer (possessor) that the total usage and other parameters be aggregated into the instance data in return for the financing, then this must be done, because it is part of the business the two have concluded with each other. As a rule, the lessor also has a justified interest, since he must know the wear and tear of the object to be able to optimize his risk.

Regarding c): we suggest that the possessor can use the instance data to the same extent as he has a right of use for the physical object, because only then can the possessor benefit from the digital twin. Only then will he appreciate a digital twin and, if necessary, pay a surcharge to the manufacturer.
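Rules a) and b) can be sketched as a resale step: agreed aggregates (here, total operating hours) flow into the instance data first, the buyer then receives the instance data, and the raw process data stays with the seller. The field names and the choice of operating hours as the aggregate are illustrative assumptions.

```python
def transfer_on_sale(instance_data: dict, process_data: list) -> dict:
    """Split the data on resale according to proposed rules a) and b)."""
    # b) aggregate agreed values from the process data into the instance data
    instance_data = dict(instance_data)  # copy, do not mutate the caller's dict
    instance_data["total_operating_hours"] = sum(
        e.get("operating_hours", 0) for e in process_data
    )
    # a) the raw process data itself stays with the seller
    return {"to_buyer": instance_data, "stays_with_seller": process_data}

result = transfer_on_sale(
    {"serial_number": "SN-001", "measured_diameter_cm": 1.543},
    [{"operating_hours": 80}, {"operating_hours": 40}],
)
print(result["to_buyer"]["total_operating_hours"])  # 120
```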

What does this mean for a company? Since it is either the owner or the possessor of its physical objects, it has a clear basis. In any case, it remains master of its process data, which is the critical data for the company, as it reflects its know-how. The manufacturer, on the other hand, also has a clear framework: in any case, he is master of his master data, just as he is master of his physical product definition. The two meet at the level of instance data, but there, too, the relationship is clear: it is based on physical possession/ownership and is therefore easy to understand.

These clear relationships offer the right basis for new solutions. A market participant can develop a service, e.g. an optimization, and for this to work he will probably need data. The possessor (who for master and process data is treated the same as the owner, and for instance data possibly differently, but with a right to use the data) can grant him access. Under which conditions the possessor grants access must then be clarified within the contractual relationship between provider and service user. Theoretically, various approaches are possible here:

  • The solution costs x €, and to use it, access to the data has to be granted

  • The solution is free, but the solution provider gets access to the data

  • Both the access and the solution are chargeable

  • Only the access is chargeable

In all these cases, it is essential that what is granted is only a right of access in order to DELIVER the solution, not permission to use the data for any other purpose.
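The purpose-bound access described above can be sketched as a small grant object: access is tied to delivering the solution and nothing else. All names here (the `AccessGrant` class, the grantee, the purpose strings) are hypothetical illustrations, not an existing API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AccessGrant:
    grantee: str
    data_category: str
    purpose: str  # the only permitted use of the accessed data

def is_use_permitted(grant: AccessGrant, requested_purpose: str) -> bool:
    """Data use is permitted only for the purpose stated in the grant."""
    return requested_purpose == grant.purpose

grant = AccessGrant("OptimizationProvider", "process", "deliver optimization")
print(is_use_permitted(grant, "deliver optimization"))  # True
print(is_use_permitted(grant, "train unrelated models"))  # False
```

As the next paragraph notes, such a check can only express the rule; it cannot technically prevent a recipient from copying the data, which is why a set of rules and a trusted party remain necessary.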

But this is precisely where the technical challenge arises: once data has been transmitted, there is no technically guaranteed way to reclaim it. This can only be handled by a set of rules, because someone can say that he has deleted the data, but you cannot really verify it. In the physical world, it is clear who possesses an object at any given time; you only have to see where it is. With data, there can be copies, passed on without any loss of quality.

What does this mean for the (technical) implementation?

Since the problem cannot be solved one hundred percent technically, there must be a mixture of technology and a "guardian." This guardian can only be a neutral player who provides a common basis for cooperation. The natural conclusion is therefore a technical platform operated by a neutral body. Here the old rule applies that all participants must trust the middleman for the whole thing to work. Having such an intermediary is certainly not the ideal situation, since one becomes somewhat dependent on the intermediary's goodwill. However, we currently do not see any solution in DLT technologies that could replace the middleman for industrial applications. The future development nevertheless remains very exciting, and we are keeping a watchful eye on it!
