The Smart Production Lab at CPI is a test bed for supply chain, production, and smart manufacturing algorithms. It is a fully functional laboratory equipped with variable-speed conveyor systems, cameras and camera rigs, workstations, and user interface consoles. The objective of this setup is to use a scaled version of practical production environments to validate state-of-the-art algorithms for supply chain and smart manufacturing systems developed by CPI’s team of graduate students and postdoctoral researchers.
Computer vision based process monitoring
Cameras are affordable sensors, and computer vision algorithms have matured to the point where camera technology is ideally suited to process monitoring. A network of cameras has been set up in the Smart Production Lab. This system automatically collects and reports information about the flow of product over the conveyor system and the status of inventory at any given instant. Technical development is currently underway to extend the capabilities of our computer vision system to associate unique identifiers, such as QR codes, with each product so that a live database of the production system can be maintained and made available to the process manager.
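The live-database idea can be sketched in a few lines: each time a camera decodes a product's QR identifier at a station, the record for that product is upserted with its latest location. The class and station names below are hypothetical illustrations, not the lab's actual schema, and the camera/decoder side is omitted.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ProductRecord:
    qr_id: str
    station: str
    last_seen: str

class LiveInventory:
    """In-memory stand-in for the live production database: each QR
    detection event upserts the product's latest known location."""
    def __init__(self):
        self.records = {}

    def observe(self, qr_id: str, station: str) -> None:
        ts = datetime.now(timezone.utc).isoformat()
        self.records[qr_id] = ProductRecord(qr_id, station, ts)

    def locate(self, qr_id: str):
        rec = self.records.get(qr_id)
        return rec.station if rec else None

inv = LiveInventory()
inv.observe("QR-0042", "conveyor-A")
inv.observe("QR-0042", "workstation-3")  # product moves downstream
print(inv.locate("QR-0042"))             # workstation-3
```

In a deployment, `observe` would be fed by the camera network's QR decoder and backed by a persistent store rather than a dictionary.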
Operational metrics development and testing for supply chains
The conveyor and workstation setup in the Smart Production Lab affords an excellent opportunity to simulate real supply chain environments. Our models can detect early-stage disruptions in production systems and supply chains. This is possible due to the development of novel operational metrics that describe how well a system is using its available resources and raw materials. Our models are validated either by manual observation of production scenarios set up in the lab or by using our camera network to provide the same information in real time.
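One simple way to picture such a metric is utilization, the fraction of available time a resource spends producing, with an early-stage disruption flagged when the latest reading drops well below its recent rolling average. The thresholding below is an illustrative sketch, not the lab's actual metric.

```python
from collections import deque

def utilization(busy_min: float, available_min: float) -> float:
    """Fraction of available machine time spent producing."""
    return busy_min / available_min if available_min else 0.0

class DisruptionDetector:
    """Flags a possible early-stage disruption when the latest
    utilization falls a set fraction below the rolling average
    of recent readings. Window and drop threshold are arbitrary."""
    def __init__(self, window: int = 10, drop: float = 0.2):
        self.history = deque(maxlen=window)
        self.drop = drop

    def update(self, value: float) -> bool:
        flagged = False
        if self.history:
            baseline = sum(self.history) / len(self.history)
            flagged = value < baseline * (1 - self.drop)
        self.history.append(value)
        return flagged

det = DisruptionDetector(window=5, drop=0.2)
readings = [0.90, 0.88, 0.91, 0.89, 0.60]   # sudden drop at the end
flags = [det.update(r) for r in readings]
print(flags)  # only the final reading is flagged
```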
Decision making model development and testing
Decisions about production processes need to be made in a timely manner and communicated to the person best placed to implement them at the right time. Without such a guarantee, even the best analytics and data-gathering systems can do little to prevent inefficiencies and delays. We have developed communication models using swim lane mapping and digraph theory that address the issue of timely and accurate communication. The ‘command center’ in the production lab is a multi-screen display that renders a live feed of the production process in the conveyor room, a real-time video analysis, and graphical information from the operational metrics model. When this information feed is routed to the appropriate decision maker, as determined by the decision model, quick and accurate corrective measures can be taken to improve production output. The setup and design of the Smart Production Lab facilitates complete testing of our vision for smart production systems.
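The digraph view of communication can be sketched as a breadth-first search: nodes are roles, edges follow reporting lines, and an alert is routed along the shortest path to the nearest role authorized to act on it. The role names, issue types, and authority table below are hypothetical, and this is a toy version of the idea rather than the lab's decision model.

```python
from collections import deque

# Hypothetical communication digraph: edges follow reporting lines.
GRAPH = {
    "vision_system":    ["line_supervisor"],
    "line_supervisor":  ["shift_manager", "maintenance_lead"],
    "maintenance_lead": [],
    "shift_manager":    ["plant_manager"],
    "plant_manager":    [],
}
# Which issues each role is authorized to act on (also hypothetical).
AUTHORITY = {
    "maintenance_lead": {"jam"},
    "shift_manager":    {"slowdown"},
    "plant_manager":    {"shutdown", "slowdown", "jam"},
}

def route_alert(source: str, issue: str):
    """BFS over the digraph: shortest path from the alert source to
    the nearest role authorized to handle the issue, or None."""
    queue = deque([[source]])
    seen = {source}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if issue in AUTHORITY.get(node, set()):
            return path
        for nxt in GRAPH.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(route_alert("vision_system", "jam"))
```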
Robot-human collaborations and task assignments
Robots are becoming smarter and more capable of working in human-populated environments and collaborating with the people who direct them in such workspaces. Production tasks are often complex and are divided into subtasks using time-and-motion studies or process mapping. The optimal allocation of subtasks between humans and their robot collaborators is, as yet, poorly understood and formulated. We have developed swim lane techniques for process mapping that enable a qualitative study of optimal task allocation, and mobile robotic ‘assistants’ that can be used to experimentally validate various task assignment paradigms. The production lab provides access to cameras, workstations, and other hardware for a complete study of task allocation in human-robot collaborations.
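At lab scale, one task assignment paradigm can be evaluated by brute force: enumerate every subtask-to-agent mapping and keep the one with the smallest makespan (the busier agent's total time). The subtask names and timing table are invented for illustration; real studies would use measured times and richer objectives.

```python
from itertools import product

# Hypothetical per-agent completion times (minutes) for each subtask.
TIMES = {
    "fetch_parts": {"robot": 2, "human": 4},
    "assemble":    {"robot": 9, "human": 5},
    "inspect":     {"robot": 6, "human": 3},
}

def best_assignment(times):
    """Brute-force every subtask->agent mapping and minimize the
    makespan. Exponential in task count, fine at lab problem sizes."""
    tasks = list(times)
    agents = ["robot", "human"]
    best = (float("inf"), None)
    for choice in product(agents, repeat=len(tasks)):
        load = {a: 0 for a in agents}
        for task, agent in zip(tasks, choice):
            load[agent] += times[task][agent]
        makespan = max(load.values())
        if makespan < best[0]:
            best = (makespan, dict(zip(tasks, choice)))
    return best

print(best_assignment(TIMES))  # minimal makespan is 8 minutes
```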
The Natural Interaction Lab in UTK’s Industrial & Systems Engineering department uses technology to understand human behaviors and gestures, with the goals of lowering barriers, improving the accessibility and availability of communications and operations, and reducing, or even eliminating, the learning curve. To minimize user tension and error in communication between humans and systems, the Natural Interaction Lab has created a series of applications deployed on computers, smartphones, and tablets in healthcare, transportation, education, manufacturing, and many other interaction-related areas.
Tour Globe is an entry-level application of a body motion sensor that allows users to interact with Google Earth, viewing and navigating satellite scenes without touching a device. It includes a set of predefined actions and the ability to expand them through a simple gesture coding system. This project provides a good starting point for learning about natural interaction.
Kids City uses virtual reality and responsive-workbench technologies to teach children how to walk on streets safely in a game environment. This creates a natural way to teach or train that immerses the participant in intuitive practice, stimulating engagement in self-directed learning and thus reducing the learning curve and increasing learning efficiency.
The same techniques can be adapted to create an intuitive virtual factory environment that lets employees practice, and stimulates their engagement in self-directed training, before they are exposed to the real tasks and risks.
The train simulator project is a comprehensive environment that traces the operator’s behavior (fatigue, emotions, etc.), vital signs (such as blood pressure and pulse), and the train’s state variables (brake pressure, throttle, speed, etc.), and captures how that behavior affects safety risks. A data-triggered monitoring program can also support the prediction of risk factors and provide alerts.
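The data-triggered alerting can be pictured as simple rules over one telemetry sample combining vital signs and train state. The field names and thresholds below are illustrative assumptions, not the simulator's actual rule set.

```python
def risk_alerts(sample: dict) -> list:
    """Rule-based alerts over one telemetry sample.
    Field names and thresholds are illustrative only."""
    alerts = []
    if sample.get("pulse", 0) > 110:                 # vital sign rule
        alerts.append("elevated pulse")
    if sample.get("eye_closure_s", 0) > 2.0:         # fatigue proxy
        alerts.append("possible fatigue")
    if sample.get("speed_mph", 0) > sample.get("speed_limit_mph",
                                               float("inf")):
        alerts.append("overspeed")                   # train state rule
    return alerts

print(risk_alerts({"pulse": 120, "eye_closure_s": 0.3,
                   "speed_mph": 48, "speed_limit_mph": 45}))
# ['elevated pulse', 'overspeed']
```

A learned risk model would replace these fixed thresholds, but the trigger-and-alert flow is the same.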
Quick Measure uses an infrared camera and image processing to detect the dimensions of boxes quickly, then uses those measurements and the applicable constraints to optimize packaging layouts. This system eliminates manual measurement operations and reduces the lead time required to collect data for further analysis, can instantly provide the best layout for packaging or piling based on environment restrictions and dynamic items, and will greatly accelerate the whole manufacturing process.
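A minimal version of the layout step, assuming the measured dimensions are already available: try every axis-aligned orientation of a box and count how many fit in a container by simple grid packing. Real packing optimization handles mixed box sizes and interleaving; the dimensions below are invented for illustration.

```python
from itertools import permutations

def boxes_per_container(box, container):
    """Count how many identical boxes fit in a container by grid
    packing, trying every axis-aligned orientation of the box."""
    best = 0
    for l, w, h in set(permutations(box)):
        count = ((container[0] // l) *
                 (container[1] // w) *
                 (container[2] // h))
        best = max(best, count)
    return best

# Hypothetical dimensions in cm: a measured box and a pallet cage.
print(boxes_per_container((30, 20, 15), (120, 100, 90)))  # 120
```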
Hand Motion Profile
Hand Motion Profile aims to create a model to measure and predict human hand capability and mobility. The profile comprises the hand’s working range, the bend angles of its joints, and the speed with which tasks are accomplished.
Modeling and predicting this profile helps in understanding and estimating the functionality of the hand, and can inform workspace design, task assignment, and efficient task strategies.
Interactive technologies using low-cost infrared sensors were employed to develop inexpensive teaching and learning environments. These provide a natural way of learning through practice and communication, and serve as good models for students learning natural interaction technologies.
Smart Classroom aims to free faculty from the unnecessary burden of carrying a bulky laptop and pointing devices. By taking advantage of indoor location-tracking technology, a lecturer can simply store course materials on a smartphone and access them on the interactive whiteboard through a touchable interface. After class, the lecturer can save the interactively written notes back to the smartphone or send them as email attachments to the students.