The Autonomous Systems Initiative (ASI) is a recently established effort to promote research and education in the challenging but highly rewarding area of autonomous systems. Together with the H2020 R&D project MULTIDRONE, ASI will organize and sponsor a series of events at IEEE ICIP 2018 (ASI@ICIP):
All ICIP 2018 participants are welcome to attend, particularly the ASI plenary and the ASI steering committee meeting, where they can propose possible future activities related to ASI.
Unmanned Aerial Vehicles (UAVs), commonly known as drones, have nowadays become useful for a plethora of services, such as scientific data collection, agricultural applications and, quite naturally, media production. Drones' potential in the media domain keeps growing thanks to several winning features, from both the motion and the shooting perspective. The MULTIDRONE project aims at developing an innovative and intelligent multi-drone platform for media production, mainly for outdoor event coverage. In the scenario envisaged by the project, the director interfaces with the system platform, communicating general and concise event coverage instructions that follow a well-defined multi-drone shooting taxonomy and are expressed in a machine-readable metadata format. As a result of this Human-Computer Interaction (HCI), a mission plan is computed, consisting of feasible flight trajectories that comply with any relevant legal restrictions. The MULTIDRONE methodology aims at streamlining the whole drone shooting process, spanning the entire continuum from eliciting editorial requirements, through media production software/hardware specification, algorithm research and development, system design, and system implementation/integration, to benchmarking/validation and end-user trials. In addition, it includes researching the necessary multi-drone AI-based active perception functionalities, as well as multi-view audio/video (AV) capture intelligence, targeting novel, robust techniques capable of operating in real-world conditions. Research on AV perception aims to provide, both before and during production, a semantic world model for drone team navigation and for AV production planning and execution, and to track individual shooting targets or target groups in real time, in real-world conditions, for production purposes.
Alberto Messina started as a research engineer with RAI in 1996, when he completed his MS thesis on objective quality evaluation of MPEG-2 video. After starting his career as a designer of RAI's Multimedia Catalogue, he has been involved in several internal and international research projects on digital archiving, automated documentation, and automated production. His current interests range from file formats and metadata standards to content analysis and information extraction algorithms. An R&D coordinator since 2005, he leads research on Automated Information Extraction & Management / Information and Knowledge Engineering, an area in which he has authored more than 80 publications. He maintains extensive collaborations with national and international research institutions through research projects and student tutorship. He holds a PhD in Business and Management, with a specialisation in Computer Science. He has been an active member of several EBU technical projects, and he now leads the EBU Strategic Programme on Media Information Management. He has worked on many European funded projects, including PrestoSpace, PrestoPrime, TOSCA-MP, the IBC Award-winning VISION Cloud, BRIDGET and, currently, MULTIDRONE. He has served on the programme committees of many international conferences, including Web Intelligence 2009-2013 and 2016, Machine Learning and Applications 2009-2013, MMM 2012, and CIKM 2016. He has been an ACM Professional Member since 2005 and was Contract Professor of Multimedia Archival Techniques at Politecnico di Torino from 2012 to 2015. He actively participates in international standardisation bodies, mainly EBU and MPEG, where he has contributed to MPEG-A, MPEG-7, and MPEG-21 extensions.
The Ocean Economy – the human use of the ocean – already makes a very significant contribution to the world economy and is likely to more than double by 2030. This puts the world's seas and oceans under increasing pressure, both from the growing range and intensity of economic activities and from climate change. Innovation holds the key to the economic success of the Ocean Economy and can help address ocean health issues. An industrial revolution is currently unfolding under the seas: rapid progress in autonomous systems, robotics, maritime surveillance, satellite systems, AI, and data science is opening up whole new sectors of ocean use and research. In this work, we investigate the role that autonomous marine systems can play in harnessing the ocean economy's potential in a responsible and sustainable way.
Gregory S. Yovanof, Ph.D. is the Managing Director of “STRATEGIS – Maritime Center of Excellence” and a Professor of Innovation & Entrepreneurship at Athens Information Technology (AIT).
His professional career includes a twenty-year tenure in the high-tech industry in the greater Asia-Pacific region, where he worked at the Research Labs of Eastman Kodak and Hewlett-Packard. He also led the development of several award-winning ICs for the DVD market as a co-founder and executive manager at two start-up companies in Silicon Valley.
Dr. Yovanof has served as member of the Board of Hellas Online. He currently offers strategy consultancy services to a number of start-up companies and clusters of innovations in the ICT, Biotech and Maritime sectors.
Dr. Yovanof received a Ph.D. degree in Communications from the University of Southern California (USC), Los Angeles, CA, in 1988. A holder of four patents on imaging systems, he is a Senior Member of the IEEE.
The talk will present Draper's most recent achievements in making autonomous systems more effective at helping humans in hazardous environments (aerial, ground, and underwater). Draper is an R&D company located in Cambridge, Massachusetts, originally a spin-off of MIT that (among many achievements) designed the control system that landed Apollo on the Moon back in 1969. In particular, the talk will focus on the importance of perception for autonomy and will describe, among other results, recent outcomes of the DARPA Fast Lightweight Autonomy program, with drones flying at up to 20 m/s in a GPS-denied environment using only a monocular camera. Underwater GPS-denied localization will be another key topic of discussion. Dr. Mariottini will also present his work on bio-inspired navigation for micro-drones in GPS-denied environments, and on creating lightweight semantic representations of the surrounding world.
Dr. Mariottini has more than 12 years of research and development experience in intelligent autonomous systems and has published more than 50 journal and conference papers in the areas of robotic perception, localization, and control. His research focuses on computer vision, robot control, and machine learning, with applications to unmanned aerial and ground robots, swarming, and assistive technology for medical applications.
Dr. Mariottini is currently a Principal Member at Draper Laboratories (Cambridge, MA, USA), which he joined in early 2016. From 2010 to 2016, he was an Assistant Professor at the University of Texas at Arlington (Dept. of Computer Science and Engineering), where he created the ASTRA Robotics and Vision Lab. Dr. Mariottini organized the International Workshop on Surgical Vision, the Robotics in Assistive Environments workshop, and the Computer-Assisted Robotics and Endoscopy workshop. He has worked with many multi-disciplinary collaborators (UT Southwestern Medical Center, Vanderbilt University, UNT Health Science Center, Univ. Rome La Sapienza, Univ. Siena, Leeds Univ., etc.) and has received support from government agencies (NSF, DARPA, DoE) as well as from industry partners.
Dr. Mariottini received his Ph.D. degree in Electrical Engineering from the University of Siena, Italy, in 2006, and held postdoctoral positions with Kostas Daniilidis (GRASP Lab, UPenn), Frank Dellaert (Georgia Tech), and Stergios Roumeliotis (University of Minnesota).