Following a discussion with a PLM manager at one of the largest French Defense & Security companies, I remembered conversations I had with friends a few years ago about whether the future of software and web apps would be fully online or would require an online/offline sync. These friends were entrepreneurs working on a pretty cool iPhone app called meetmytunes. One of them now works at Evernote, a role clearly related to what we were discussing back then. It was the period (3 to 4 years ago) when web coverage was constantly growing in France and 3G was already there, so the question was: are we going full-web, or do we still need to care about the moments when you may not be connected?
The use cases are numerous. One of the latest I've heard of involves people sent into the field to test systems such as automatic speed detectors on roads. They check all the data and enter the results into systems that are not necessarily connected to any network, so most of the time the answer is Excel or Access. The problem is that these solutions are not controlled enough to avoid data retyping and cleaning once the person is back at the company. More recently, we have been talking with companies that maintain systems on boats; I doubt the Wi-Fi password is posted in your cabin when you board a military ship (I could be wrong though).
Sync apps or just delayed-import
This problem of being sometimes disconnected has been addressed by some software through a synchronization strategy. Many solutions are available to synchronize files, like Dropbox or ownCloud, and others synchronize content, like Evernote for example.
If data is not synchronized automatically, the old method is simply to import it once back at the company. This doesn't have to be a complicated task: it can be just an import button to click, which in the end amounts to much the same process.
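To make that "import button" more concrete, here is a minimal sketch of a delayed import step. It assumes field results were captured offline in a CSV file; the field names and the validation rule are purely illustrative, not taken from any actual PLM product. The point is that validating records in one controlled batch is what spreadsheets alone don't give you.

```python
import csv

# Hypothetical required columns for a field-measurement record;
# in practice these would come from the company's PLM data model.
REQUIRED_FIELDS = {"device_id", "measured_speed", "timestamp"}

def load_offline_records(csv_path):
    """Read the offline capture file, splitting complete rows from
    incomplete ones so nothing dirty reaches the import step."""
    valid, rejected = [], []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            # A row is valid only if every required field is present
            # and non-empty.
            if REQUIRED_FIELDS.issubset(k for k, v in row.items() if v):
                valid.append(row)
            else:
                rejected.append(row)
    return valid, rejected
```

Rejected rows can then be shown to the user for correction instead of being silently retyped later, which is exactly the cleanup step the Excel/Access workflow forces on people today.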
The problem of data-structure updates
The problem there is: what happens if the data structure changes in the company? How do you make sure you still enter data in the right format for the system it will be imported into? It might not be a big issue, but you need to make sure that any evolution of the company-hosted system has a matching impact on the offline tools being developed. Data modeling is an interesting topic in this case: we want to synchronize different types of information, which can be composed of metadata but also of files, and these elements may have different handling rules once back in PLM.
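One common way to keep offline tools and the hosted system from drifting apart is to tag every record with the schema version it was captured under, and migrate old records step by step at import time. The sketch below assumes such a versioning scheme; the version numbers, field renames, and migration functions are invented for illustration only.

```python
# Illustrative only: every offline record carries the schema version it
# was captured with, and step-wise migrations bring it up to date.
CURRENT_SCHEMA = 3

def _v1_to_v2(rec):
    # Pretend v2 split a single "location" string into two fields.
    rec["site"], _, rec["road"] = rec.pop("location").partition("/")
    return rec

def _v2_to_v3(rec):
    # Pretend v3 renamed "speed" to "measured_speed_kmh".
    rec["measured_speed_kmh"] = rec.pop("speed")
    return rec

# Each entry upgrades records from version N to N + 1.
MIGRATIONS = {1: _v1_to_v2, 2: _v2_to_v3}

def migrate(record):
    """Upgrade a record to the current schema before import."""
    version = record.get("schema_version", 1)
    while version < CURRENT_SCHEMA:
        record = MIGRATIONS[version](record)
        version += 1
    record["schema_version"] = CURRENT_SCHEMA
    return record
```

The design choice here is that the hosted system never has to understand old formats: the import layer owns the migrations, so an offline tool built against last year's structure still produces importable data.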
I’ve discussed this topic with various consultants working for some of the major software vendors, and most of them told me “yeah, we’ve done that for this customer, our software can do it”. But I haven’t had the same feedback from PLM consumers, so I’m not giving any direction to take for now. We are working on it at Minerva to provide a standard solution that is still flexible enough to be used in any industry. What is your experience with offline data management and its synchronization back to your PLM instance?