Events

Building effective RAG applications for public good
ISTD COIL Seminar by Isaac Lim – Join Isaac, a Data Scientist from GovTech’s AI Practice, as he shares his experience empowering developers in government to build effective RAG (and more generally, LLM-based) applications.

Immanuel M. Bomze (University of Vienna) – Need to relax – but perhaps later? Reflections on modeling sparsity and mixed-binary nonconvex optimization problems

Bridging the gap: navigating the transition from design school to UX design in the real world
ISTD COIL Seminar by Samath Aravinda & Imran Nordin – In this talk, I will share my personal journey as a UX designer, highlighting the essential design processes that drive successful projects. We will discuss the myriad opportunities available in the field, from emerging technologies to innovative methodologies, and how these can shape your career trajectory. To enrich our discussion, I am excited to introduce my colleague, a seasoned UX researcher, who will present for 15–20 minutes on the UX research process. He will cover key methodologies, tools, and techniques used to gather user insights that inform design decisions.

LLMs for out-of-domain use cases
ISTD COIL Seminar by Anirudh Shrinivason – This talk will “delve” into Cohere’s strategies for harnessing the potential of Large Language Models (LLMs) within the enterprise domain. By exploring fine-tuning techniques and the integration of Retrieval Augmented Generation (RAG), we aim to showcase how these methods can be tailored to solve complex, industry-specific challenges.

Dr. Kishor Bharti (IHPC) – Quantum Machine Learning: What’s Happening, What’s Hard, and What’s Next


Jonathan Scarlett (National University of Singapore) – Exploration and Optimization of Black-Box Functions with Safety Guarantees

Introduction to modern LLM pre-training: Sailor use case
ISTD COIL Seminar by Qian Liu – This talk will present some key techniques in modern LLM pre-training, including scaling laws, data quality engineering, data mixture optimization, and efficient training strategies. We will use Sailor, a family of open language models (0.5B to 14B parameters) tailored for South-East Asian languages, as a case study.


Alessandro Luongo (National University of Singapore) – Quantum Algorithms and Their Applications in Finance


Baby Shark Fund Open Pitch

Fair generative modelling
ISTD PhD Oral Defence Seminar by Teo Tzu Hsuan Christopher – In this dissertation, we make important contributions in improving fairness in generative models by identifying and addressing constraints which may limit their broader adoption.