This short course on Cloud/Edge Computing for Deep Learning and Big Data Analytics provided a comprehensive overview of advanced technologies used in distributed computing systems.
Distributed computing plays a critical role in today’s data-driven landscape, enabling vast amounts of data to be processed efficiently across multiple nodes and locations.
By distributing computing tasks across systems, it improves scalability, reliability, and performance, making it indispensable for big data analytics, machine learning, and real-time processing workloads.
The course consisted of six lectures and one workshop, covering some of the most widely used tools for building distributed systems. Docker and Kubernetes streamline distributed computing by containerizing applications and automating their deployment and management across clusters of servers. The Orion Context Broker enables seamless communication and data sharing between distributed components, while Apache Airflow automates workflow scheduling, improving the efficiency of data processing pipelines. Together, these technologies allow organizations to take full advantage of distributed systems, driving innovation and maximizing the value of their data.
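As an illustration of the kind of pipeline automation Airflow provides, the sketch below defines a minimal two-step DAG. The DAG name, task names, and schedule are hypothetical and not drawn from the course material; the sketch assumes Apache Airflow 2.4+ with its standard PythonOperator.

```python
# Minimal Apache Airflow DAG sketch: a two-step data pipeline that runs daily.
# All names and the schedule are illustrative assumptions, not course material.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder for a data-ingestion step, e.g. pulling records from a broker or data lake.
    print("extracting data")


def train():
    # Placeholder for a downstream processing or model-training step.
    print("training model")


with DAG(
    dag_id="example_pipeline",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # run once per day
    catchup=False,                    # do not backfill past runs
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    train_task = PythonOperator(task_id="train", python_callable=train)

    extract_task >> train_task        # extract must finish before train starts
```

In a real deployment, each task would typically invoke containerized services (for example, ones orchestrated by Kubernetes) rather than simple print statements, with Airflow handling scheduling, retries, and dependency ordering across the pipeline.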
Additionally, a lecture on decentralized DNN applications provided a research-oriented view of multi-agent learning systems. Finally, the workshop combined all of the aforementioned tools into a practical application.