My involvement in both industry and science did not begin when AutoLab moved into the EBS Center and our company's activities started taking place directly on the TU Graz campus. My master's thesis, written together with TU Graz, led to a patent application for the automated treatment of thyroid diseases. I have therefore long been involved in entrepreneurial projects and scientific research in parallel.
In my role as Managing Director of AutoLab, I regularly face tough time, cost and innovation pressures in the manufacturing industry. At the same time, I have deep insight into the methodical work of application-oriented basic research. Seeing both worlds first-hand shows me where each side has its strengths and weaknesses - and what they can learn from each other.
What industry can learn from research
First-principles thinking
Whether in industry or research, solutions must always rest on a logical foundation. Unfortunately, companies often rely on dogmas - rules that claim to be unquestionable truths. Far too often, decisions are not based on arguments; instead, the law of the most experienced applies, and things are ultimately done as they have "always been done". This not only kills innovation but also leads to bizarre practices. For example, measuring planes are defined with only two reference points, although three are mathematically necessary to define a plane. The impossible is then made 'possible' by compensating for the differences with dozens of offsets and numerous measurement reports until all sides measure 'the same'. The fundamental realization that a plane needs three measuring points can save hundreds of hours and a great deal of money. The time needed for first-principles thinking is negligible in comparison, as significant logic errors can usually be identified within a few hours.
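The underlying geometry can be checked from first principles in a few lines. The following sketch (invented for illustration, not taken from any real measurement software) shows why two points can never fix a plane - any plane containing their connecting line fits - while three non-collinear points determine one uniquely via the cross product:

```python
# First-principles check: three non-collinear points determine a plane
# uniquely; two points do not (infinitely many planes contain a line).

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def plane_from_points(p1, p2, p3):
    """Return (normal, d) such that normal . x = d for all points on the plane."""
    n = cross(sub(p2, p1), sub(p3, p1))
    if n == (0, 0, 0):
        # Collinear points span only a line, not a plane - exactly the
        # degenerate situation a two-point 'plane definition' is always in.
        raise ValueError("points are collinear: no unique plane")
    d = n[0] * p1[0] + n[1] * p1[1] + n[2] * p1[2]
    return n, d

# Three points in the z = 0 plane yield a normal along the z-axis:
normal, d = plane_from_points((0, 0, 0), (1, 0, 0), (0, 1, 0))
print(normal, d)
```

Any 'two-point plane' corresponds to the collinear branch above: the system is underdetermined, which is precisely why it can only be patched over with offsets instead of being solved.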
An unbiased approach
It also becomes problematic when processes and tools are so rigid that new methods and approaches have no chance. Of course, there is greater cost pressure in companies and it is not possible to jump on every innovation immediately. Nevertheless, you should remain open to new ideas. Often, however, the same tools or frameworks are used again and again - simply because someone already knows them - instead of questioning what is really best suited.
We are currently redesigning our Vision app. We are therefore open to all technologies and are examining how to design the new front end. Our machine vision tech lead put it very clearly: "We are open to any framework, but only if there are technical arguments for it. 'Because I have experience in XY' doesn't count here." A team that only thinks within its own horizon of experience and always selects tools based on familiarity will inevitably be overtaken at some point. Following first-principles thinking, this is unavoidable.
Development freedom
I have already supervised several Master's theses, and the format often offers considerable individual freedom. The same applies to many doctoral theses: the project is written down on a slide or a single A4 page, and the researcher then spends months or years working toward results. Supervision usually takes the form of rather loose feedback - and yet this often produces great results.
Obviously, such a loose approach cannot be transferred 1:1 to the business world with its hard deadlines. There, however, the opposite is often the case: tight micromanagement, which has long since become standard in many companies. I think it is extremely important to trust your developers enough that you don't have to micromanage them, but can instead give them a great deal of personal responsibility and creative freedom.
However, I am also convinced that you cannot achieve this simply by reading a Scrum book once and then introducing 'agile development'. It stands or falls with getting the right developers onto the team. I am equally convinced that the agile approach simply cannot and will not work for some personalities. I myself had to learn a lot along the way and am glad that we now have a great team at AutoLab where everyone trusts each other 100%. This also creates plenty of space for creative development and innovation.
What research can learn from industry
Efficiency and scaling
It is often extremely impressive to see the technical innovations and approaches that come out of a Master's thesis. Unfortunately, they are usually barely usable for others. The wide latitude in implementation often leads to catastrophic code bases - and nowadays almost all technical work, not just in computer science, consists to a large extent of software.
The result is often an unmaintainable code base with little or poor documentation and a monolithic framework. Common practices such as unit testing are almost always missing, so the code remains full of minor (and major) bugs. As soon as further work is to build on it, the new employee or researcher must first spend weeks or even months learning the monolithic framework. In most cases, large parts are simply rewritten.
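The safety net that is usually missing need not be heavyweight. A hypothetical example (function and test names invented for illustration) of the kind of minimal, assert-based unit test that catches bugs before the next researcher inherits the code:

```python
# Hypothetical example: a tiny unit test for a measurement-conversion
# helper. In a real research code base, such tests would live next to the
# code and run automatically on every change (e.g. via pytest or CI).

def mm_to_inch(mm: float) -> float:
    """Convert a length in millimetres to inches."""
    return mm / 25.4

def test_mm_to_inch():
    # Exact reference value: 25.4 mm is defined as exactly 1 inch.
    assert mm_to_inch(25.4) == 1.0
    # Approximate check with an explicit tolerance.
    assert abs(mm_to_inch(10.0) - 0.3937) < 1e-3

test_mm_to_inch()
print("all tests passed")
```

Even a handful of such tests documents the intended behaviour far better than no documentation at all - and they keep working when the original author has long since graduated.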
The time lost as a result is enormous and could be better invested in actual research. The prerequisite for more efficient collaboration, however, is an awareness of clean code quality and of collaborative, agile working. Instead, isolated solutions are often created in which only a single PhD or Master's student can find their way around. In my opinion, somewhat stricter implementation guidelines could prevent this without restricting research freedom.
Project management
In my opinion, agile project management such as Scrum would be ideal for many research activities, and tools like JIRA or Confluence could support it. Of course, at the end of the day there are numerous papers and theses describing the exact methodology and derivations. What is often missing, however, is a knowledge wiki for the tools used and frameworks created. The description of the nitty-gritty - for example, how to use newly developed software libraries - frequently gets lost. As a result, important knowledge disappears and an unnecessary amount of time is wasted.
With open research questions, it is easy to get lost in side issues. In my opinion, defining concrete tasks and sprints could significantly increase the pull toward the target and thus also the scientific output. Since agile frameworks are designed to allow a lot of freedom anyway, they would not be so rigid as to prevent new findings or insights.
Focus on results
In my circle of friends, too, I have often seen people lose themselves in false perfectionism when writing academic papers. What exactly is the definition of a 'perfect paper'? By what criteria would the text be measured? Of course, a scientific paper must meet high quality standards, but none will ever be completely perfect.
In business, there is almost always a deadline or some form of time pressure. I think everyone has had the experience that no matter where you set the deadline, the project will be finished around that time. And with every day the deadline draws closer, your own productivity usually increases rapidly.
I am convinced that freedom in research is very valuable, but clearly defined content and time targets could significantly improve the process. I would recommend looking up Parkinson's Law.