In multimedia research, dish recognition is regarded as a difficult problem because food varies widely in shape and color depending on cooking and cutting methods. As a result, although a large number of cooking recipes are posted on the Internet, finding the right recipe for a food picture remains a challenge. The problem is also shared by health-related applications. For example, food-log management, which records daily food intake, often requires manual input of foods and ingredients for nutrition estimation. This talk will share the challenge of recognizing ingredients in dishes for recipe retrieval. Finding a recipe that exactly describes a dish is difficult because ingredient compositions vary across geographical regions, cultures, seasons and occasions. I will introduce deep neural architectures that explore the relationships among food, ingredients and recipes for recognition. The learnt deep features are then used for cross-modal retrieval of food images and recipes.
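The cross-modal retrieval step can be illustrated with a minimal sketch: rank recipes by the similarity of their embeddings to a query image embedding. The vectors and function names below are hypothetical toy stand-ins for the learnt deep features the talk describes, not the actual system.

```python
# Toy sketch of embedding-based cross-modal retrieval: rank recipe
# embeddings by cosine similarity to a query image embedding.
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def retrieve(image_emb, recipe_embs, top_k=2):
    """Return recipe ids sorted by similarity to the image embedding."""
    scored = [(cosine(image_emb, emb), rid) for rid, emb in recipe_embs.items()]
    scored.sort(reverse=True)
    return [rid for _, rid in scored[:top_k]]

# Hypothetical 3-d embeddings; a real system would use a trained network.
recipes = {
    "fried_rice": [0.9, 0.1, 0.3],
    "beef_stew":  [0.1, 0.8, 0.2],
    "veg_salad":  [0.2, 0.2, 0.9],
}
print(retrieve([0.85, 0.15, 0.25], recipes, top_k=1))  # -> ['fried_rice']
```

In practice the image and recipe encoders are trained jointly so that matching pairs land close together in the shared embedding space; the ranking step itself stays this simple.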
Chong-Wah Ngo is a Professor in the Department of Computer Science at the City University of Hong Kong. He received his Ph.D. in Computer Science from the Hong Kong University of Science & Technology, and his M.Sc. and B.Sc., both in Computer Engineering, from Nanyang Technological University, Singapore. Before joining City University of Hong Kong, he was a postdoctoral scholar at the Beckman Institute of the University of Illinois at Urbana-Champaign. His main research interests include large-scale multimedia information retrieval, video computing, multimedia mining and visualization. He is the founding leader of the video retrieval group (VIREO) at City University, a research team that releases open-source software, tools and datasets widely used in the multimedia community. He was an associate editor of IEEE Transactions on Multimedia, and has served as guest editor of IEEE MultiMedia, Multimedia Systems, and Multimedia Tools and Applications. He is on the steering committees of TRECVid and ICMR (Int. Conf. on Multimedia Retrieval). He was conference co-chair of ICMR 2015, and program co-chair of ICMR 2012, MMM 2012 and PCM 2013. He also served as chairman of the ACM Hong Kong Chapter during 2008-2009.
Title: Trustworthy Software and Automatic Program Repair
Software controls many critical infrastructures, and a variety of software analysis methods have been proposed to enhance the quality, reliability and security of software components. In this talk, we will first survey the gamut of methods developed so far in software validation research, ranging from systematic testing, to analysis of program source code and binaries, to formal reasoning about software components. We will also discuss the research on trustworthy software at NUS, which makes software vulnerability detection, localization and patching much more systematic. We will specifically explore research on futuristic programming environments that enable auto-patching of software vulnerabilities, with a focus on automatic program repair, where software errors are detected and fixed continuously. This research aims to realize the vision of self-healing software for autonomous cyber-physical systems, where autonomous devices may need to modify the code controlling the device on the fly to maintain strict guarantees about trust.
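The generate-and-validate flavor of automatic program repair can be sketched in a few lines: propose candidate patches for a suspect expression and accept the first one that passes all tests. This is a deliberately tiny toy, not the NUS tooling; the candidate pool and test suite are invented for illustration.

```python
# Toy generate-and-validate repair: a buggy "max" has its comparison
# flipped. Candidate expressions are tried until all tests pass.

def run_tests(fn):
    """A patch is validated only if every test passes."""
    tests = [((2, 3), 3), ((7, 1), 7), ((5, 5), 5)]
    return all(fn(a, b) == want for (a, b), want in tests)

# Hypothetical patch space for the faulty expression.
candidates = [
    "a if a < b else b",   # the original buggy expression
    "a if a > b else b",   # a candidate fix
    "b",                   # another (wrong) candidate
]

def synthesize(expr):
    """Build a callable from a candidate expression."""
    return eval("lambda a, b: " + expr)

patch = next(expr for expr in candidates if run_tests(synthesize(expr)))
print(patch)  # -> a if a > b else b
```

Real repair tools first localize the fault (e.g. via failing-test coverage) and search a far richer, often symbolically constrained, patch space; the validate-against-tests loop above is the common core.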
Abhik Roychoudhury is a Professor of Computer Science at the School of Computing, National University of Singapore. Abhik received his Ph.D. in Computer Science from the State University of New York at Stony Brook in 2000. Since 2001, he has been employed at the National University of Singapore. His research has focused on software testing and analysis, software security, and trustworthy software construction. He has been an ACM Distinguished Speaker (2013-19). He is currently leading the TSUNAMi center, a large five-year targeted research effort funded by the National Research Foundation in the domain of software security. He is also the Lead Principal Investigator of the Singapore Cyber-security Consortium. His research has been funded by various agencies and companies, including the National Research Foundation (NRF), Ministry of Education (MoE), A*STAR, Defense Research and Technology Office (DRTech), DSO National Laboratories, Microsoft and IBM. He authored the book "Embedded Systems and Software Validation", published by Elsevier (Morgan Kaufmann) in the Systems-on-Silicon series in 2009, which has also been officially translated into Chinese by Tsinghua University Press. He has served in various capacities on the program and organizing committees of conferences on software engineering, notably as Program Chair of the ACM International Symposium on Software Testing and Analysis (ISSTA) 2016 and General Chair of the ACM SIGSOFT Symposium on Foundations of Software Engineering (FSE) 2022. He is currently serving as an Editorial Board member of IEEE Transactions on Software Engineering (TSE).
IEEE Fellow; IEEE Distinguished Lecturer; ONF Research Associate; Editor-in-Chief, IEEE Communications Surveys and Tutorials
Distinguished Professor of National Chiao Tung University, Hsinchu, TAIWAN
Title: Network Cloudification: SDN-NFV and 5G-MEC with Edge and Fog Computing
The second wave of cloud computing, named network cloudification, in the forms of SDN (Software Defined Networking), NFV (Network Function Virtualization), and 5G-MEC (Mobile Edge Computing), is to centralize and virtualize networking into data centers. It enables operators to offer NaaS (Networking as a Service) with much lower CAPEX and OPEX and greater flexibility, because devices become simpler, fewer administrators are needed, and service orchestration is easier. It turns parts of communications currently done in hardware into computing done in software. However, the hosts of these data centers will not be Google-like super data centers, as those are too far away from subscribers. Latency requirements of 10 ms and 1 ms push cloud computing down to edge and fog computing, with CORD (central offices re-architected as data centers) and cellular base stations for SDN-NFV and 5G-MEC, respectively. In this talk, we first argue why, where and when SDN, NFV and 5G-MEC will prevail, and then illustrate how to make them happen with OpenFlow, SC (Service Chaining), NSH (Network Service Header), etc. We then examine how the latency requirement dominates this virtualization game by listing the key resource-allocation questions to answer in the architectures of SDN, NFV, and 5G-MEC. Their answers are mostly unknown now, but would benefit the architects and developers of OpenFlow switches, SDN controllers, SDN-NFV apps, NFV data centers, MEC-enabled base stations, and operators' infrastructure in general.
YING-DAR LIN is a Distinguished Professor of Computer Science at National Chiao Tung University (NCTU) in Taiwan. He received his Ph.D. in Computer Science from UCLA in 1993. He served as CEO of the Telecom Technology Center in Taipei during 2010-2011 and as a visiting scholar at Cisco Systems in San Jose during 2007-2008. Since 2002, he has been the founder and director of the Network Benchmarking Lab (NBL, www.nbl.org.tw), which reviews network products with real traffic; NBL became a certified test lab of the Open Networking Foundation (ONF) in July 2014. He also cofounded L7 Networks Inc. in 2002, which was later acquired by D-Link Corp. His research interests include the design, analysis, implementation, and benchmarking of network protocols and algorithms, quality of service, network security, deep packet inspection, wireless communications, embedded hardware/software co-design, and, recently, network cloudification. His work on "multi-hop cellular" was the first along this line; it has been cited over 800 times and standardized into IEEE 802.11s, IEEE 802.15.5, WiMAX IEEE 802.16j, and 3GPP LTE-Advanced. He is an IEEE Fellow (class of 2013), an IEEE Distinguished Lecturer (2014-2017), and a Research Associate of ONF. He has served or is serving on the editorial boards of IEEE Transactions on Computers, IEEE Transactions on Sustainable Computing, IEEE Computer (Associate Editor-in-Chief), IEEE Network, IEEE Communications Magazine - Network Testing Series, IEEE Wireless Communications, IEEE Communications Surveys and Tutorials, IEEE Communications Letters, Computer Communications, Computer Networks, Journal of Network and Computer Applications, and IEICE Transactions on Communications. He is currently the Editor-in-Chief of IEEE Communications Surveys and Tutorials. He has guest-edited several special issues in IEEE journals and magazines, co-chaired symposia at Globecom'13 and ICC'15, and chaired workshops and symposia at Globecom'18 and Globecom'19.
He published a textbook, Computer Networks: An Open Source Approach (www.mhhe.com/lin), with Ren-Hung Hwang and Fred Baker (McGraw-Hill, 2011). It is the first text that interleaves open source implementation examples with protocol design descriptions to bridge the gap between design and implementation.
Graduate School of Information Science and Technology
Title: Trends and applications of Big Data and IoT techniques
As people say, "data is the new oil": big data is expected to make a large impact on our society and economy by mining hidden knowledge and rules from the data.
In particular, the structure of real-world data is shifting from the traditional relational data model to a more generalized graph data model as the web and social media grow in popularity around the world. One of the most important technical challenges here is to efficiently analyze large graph data that express various types of relationships between people, items, and places. In this talk, we overview the trends in Big Data and IoT and then explain our research on distributed query optimization in cloud environments and efficient graph mining algorithms. Finally, we introduce some interesting applications of Big Data: 1) social network analysis employing graph mining algorithms, 2) business data analysis using exploratory data analysis techniques, and 3) a smart route recommendation system empowered by IoT.
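One elementary building block of the graph mining described above is counting triangles in a social graph, a primitive behind clustering-coefficient and community analyses. The sketch below uses a toy adjacency list; the graph and names are invented for illustration, not drawn from the talk.

```python
# Toy sketch: count triangles (mutual-friend triples) in an undirected
# social graph represented as an adjacency list of neighbor sets.
from itertools import combinations

graph = {
    "ann":  {"bob", "carl"},
    "bob":  {"ann", "carl", "dee"},
    "carl": {"ann", "bob"},
    "dee":  {"bob"},
}

def count_triangles(adj):
    """Each triangle is counted exactly once by iterating sorted triples."""
    count = 0
    for u, v, w in combinations(sorted(adj), 3):
        if v in adj[u] and w in adj[u] and w in adj[v]:
            count += 1
    return count

print(count_triangles(graph))  # -> 1 (the ann-bob-carl triangle)
```

This O(n^3) enumeration is only workable on toy graphs; the efficient algorithms the abstract alludes to use neighbor-intersection orderings and distributed execution to scale to web-sized graphs.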
Makoto Onizuka is a Professor in the Department of Multimedia Engineering, Graduate School of Information Science and Technology, Osaka University. He received his B.S. degree in computer science from Tokyo Institute of Technology in 1991 and joined NTT Labs in the same year. He received his Ph.D. degree in computer science from Tokyo Institute of Technology in 2007. During 2000-2001, he was a visiting scholar at the University of Washington. He was a distinguished technical member at NTT Labs from 2010 to 2014, and a visiting professor at the University of Electro-Communications from 2012 to 2014.
He developed LiteObject (an object-relational main-memory database system), pgBoscage (an XML database system on PostgreSQL), XMLToolkit (an XML stream engine and Unix-like XML data processing tools), and CBoC type2 (Common IT Bases over Cloud Computing at NTT). His research now focuses on cloud-scale data management and big data mining.
Title: Heuristics for vehicle routing problems: Current challenges and future prospects
Vehicle Routing Problems (VRPs) involve designing least-cost delivery routes to visit a geographically dispersed set of customers. Over the past 60 years, this class of problems has been the subject of considerable work, amounting to thousands of articles. In 2017, we can reasonably say that the classical "capacitated" VRP (with only capacity constraints) is fairly well solved by metaheuristic techniques. Yet research on VRPs keeps expanding, as a consequence of the increasing diversity of applications, which bring forth new difficult constraints, objectives, and combined decisions to account for customers' needs and vehicle and network restrictions, and to better integrate VRP optimization into decision chains. Moreover, with the advent of collaborative logistics, green initiatives, smart cities, and multi-modal transport, in contexts where multiple stakeholders and conflicting objectives must be considered jointly, or in the presence of dynamic problems with a short response time, the efficient resolution of these problems becomes even more critical.
In this talk, we will review some of the most challenging and recent VRP variants, and examine the heuristic solution techniques developed to tackle them. We will study the close connections between the structure of a problem's decision sets and the associated solution methods, showing how modern heuristics can effectively search a reduced space defined by fewer groups of decision variables. Finally, a key challenge is to progress towards "unified" solution methods, which are not tailored to one single problem but are instead designed to solve a wide collection of problem variants with different constraints and objectives. For this purpose, we present some of the main principles of the Unified Hybrid Genetic Search (UHGS), which has recently been extended to obtain state-of-the-art results, in a single code base, for more than 50 difficult variants of vehicle routing and arc routing problems.
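To ground the problem definition, the capacitated VRP can be illustrated with the simplest of construction heuristics: repeatedly visit the nearest feasible customer and open a new route when capacity runs out. This nearest-neighbour sketch is not UHGS; it is the kind of cheap starting solution a metaheuristic would then improve. Coordinates, demands and the capacity are toy data.

```python
# Toy nearest-neighbour construction for the capacitated VRP.
# customers: id -> ((x, y), demand); every route starts at the depot
# and serves customers until no remaining demand fits the capacity.
import math

depot = (0.0, 0.0)
customers = {1: ((1, 1), 4), 2: ((2, 0), 3), 3: ((0, 3), 5), 4: ((5, 5), 4)}
capacity = 8

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def nearest_neighbour_routes():
    unserved = dict(customers)
    routes = []
    while unserved:
        load, pos, route = 0, depot, []
        while True:
            # Customers whose demand still fits on this vehicle.
            feasible = [c for c, (_, d) in unserved.items() if load + d <= capacity]
            if not feasible:
                break
            nxt = min(feasible, key=lambda c: dist(pos, unserved[c][0]))
            route.append(nxt)
            load += unserved[nxt][1]
            pos = unserved[nxt][0]
            del unserved[nxt]
        routes.append(route)
    return routes

print(nearest_neighbour_routes())  # -> [[1, 2], [3], [4]]
```

Greedy construction like this is often far from optimal; metaheuristics such as UHGS recombine and locally improve many such solutions while handling the richer constraints the talk discusses.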
Thibaut Vidal is a professor in the Department of Computer Science of the Pontifical Catholic University of Rio de Janeiro (PUC-Rio), Brazil. Previously, he was a postdoctoral researcher at LIDS, Massachusetts Institute of Technology, USA. He obtained a joint Ph.D. in computer science from the University of Montreal, Canada, and the Troyes University of Technology, France. His main domains of expertise are combinatorial optimization, metaheuristics, integer and dynamic programming, and convex optimization, with applications to supply chain management, resource allocation, signal processing, and machine learning. In the last five years he has authored over 20 articles in reputed international journals such as Operations Research, Transportation Science, SIAM Journal on Optimization, European Journal of Operational Research and Computers & Operations Research. Prof. Vidal is the recipient of two TSL best paper awards, for 2014 and 2016, from the Transportation Science and Logistics Section of INFORMS, and of the 2016 EJOR best paper award in the category "Theory and methodology". He also received the Young Researcher Prize of ROADEF, the French Operations Research and Decision Support Society, in 2013. His Ph.D. thesis was mentioned in the honor list of the Dean of Postgraduate Studies at the University of Montreal and recognized with several dissertation prizes. The source code from his recent research is distributed in open-source format, including the "Split" library, available at https://w1.cirrelt.ca/~vidalt/, and "HGS-CARP", a state-of-the-art heuristic solver for arc-routing problems, distributed at https://github.com/vidalthi/HGS-CARP/. Prof. Vidal has served as a consultant on a number of industrial projects (approximately 800 hours). He also gave a talk at Amazon.com in 2015.
Assistant Chair (Graduate Studies), School of Computer Science and Engineering
College of Engineering, Nanyang Technological University, Singapore
Title: AI and Big Data analytics for health and bioinformatics
Technological advances now allow high-throughput profiling of biological systems at low cost, and this low cost of data generation is leading us into the "big data" era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this talk, I will start with the concepts in the analysis of big data, specifically the AI algorithms.
My group, the Biomedical Informatics Lab (BIL), is a research centre that serves as the focus of education, research and development, and human-resource training in health informatics and bioinformatics at NTU. The mission of BIL is to provide an interdisciplinary environment and training for students and researchers to engage in leading, cutting-edge research in bioinformatics, and thereby become part of the life sciences workforce in Singapore and elsewhere.
By presenting selected research activities, this talk will provide an overview of some of the innovative and creative approaches that apply AI and big data analytics to address challenges in both health informatics and bioinformatics.
Dr. Kwoh Chee Keong has been with the School of Computer Engineering since 1993. He received his Bachelor's degree in Electrical Engineering (1st Class) and Master's in Industrial System Engineering from the National University of Singapore in 1987 and 1991 respectively, and his Ph.D. from Imperial College, University of London in 1995. His research interests include data mining, soft computing and graph-based inference; application areas include bioinformatics and biomedical engineering. He has done significant research work in these areas and published over 90 papers in quality international conferences and over 30 journal papers. He has often been invited as an organizing member, referee and reviewer for a number of premier conferences and journals, including GIW, IEEE BIBM, RECOMB, PRIB, etc. Dr. Kwoh is a member of the Institution of Engineers Singapore, the Association for Medical and Bio-Informatics, and the Imperial College Alumni Association of Singapore (ICAAS). He has provided extensive service to professional bodies and to Singapore, and was conferred the Public Service Medal by the President of Singapore in 2008.