I am pleased to announce that I have successfully passed my Ph.D. Dissertation Defense in the Department of Electrical and Computer Engineering (ECE) at the University of Washington (UW). This has been a long and challenging journey, and I am grateful for the support of my supervisory committee, colleagues, sponsors, family, friends, and my faith.
Starting a new church can be a daunting task. However, when a group of Christ followers feel called by God to spread the Gospel, they know they must answer. That’s exactly what happened with the team at Waymaker Church.
As a software developer, I love to participate in hackathons to test my skills and knowledge, as well as to collaborate with fellow tech enthusiasts. One of the most exciting hackathons I have participated in was Code for the Kingdom (C4TK) - Seattle 2019, where my team won the People’s Choice Award.
I still remember the feeling of excitement and awe when I received the news that I had been elected to be a Session Elder of the University Presbyterian Church (UPC). It was an honor that came with great responsibility and a strong sense of duty to serve my fellow members and the church community as a whole.
On a beautiful day in Seremban, Malaysia, Macy Lee and I had our dream wedding. It was a day filled with love, joy, and the presence of God that we will cherish forever.
Love is a beautiful thing, and nothing symbolizes that more than a wedding ceremony. Macy Lee and I, two lovebirds who have been together for a while, decided to take our relationship to the next level by tying the knot in an intimate vow ceremony. We chose the beautiful rooftop lounge of Augusta Apartments in Seattle, WA, to exchange our vows in front of our church and lab friends.
Love is in the air, and when it’s time to take that big step, nothing is more exciting than the perfect proposal. For me, that moment came on Thanksgiving Day in 2021, at my friend’s house in Bothell, WA. I proposed to my girlfriend, Macy Lee, and I’m thrilled to say that she said yes!
Announcing the official release of the Metropolis Multi-Camera Tracking AI Workflow, featured in the NVIDIA GTC’24 keynote by Jensen Huang. This innovative solution accelerates the development of vision AI applications for large spaces, enhancing safety, efficiency, and management across various industries. Leveraging NVIDIA’s cutting-edge tools, this workflow offers a validated path to production, customizable AI models, and comprehensive support, enabling seamless development from simulation to deployment. Join us in transforming infrastructure and operations with advanced AI technology.
As the lead organizer of the AI City Challenge at CVPR, I’m excited to highlight our progress with NVIDIA Omniverse, which provided the largest indoor synthetic dataset for over 700 teams from nearly 50 countries. This dataset, essential for developing AI models to improve efficiency in retail, warehouse management, and traffic systems, included 212 hours of video across 90 virtual environments. Our global collaboration with ten prestigious institutions underscores the effort to advance AI for smart cities and automation. NVIDIA’s innovations, like Omniverse Cloud Sensor RTX, will further accelerate autonomous system development. Join the Omniverse community to stay updated and connected.
In the heart of the industrial automation revolution, the Metropolis multi-camera tracking system emerges as a beacon of innovation, seamlessly integrating with NVIDIA’s AI suite to redefine efficiency and safety in complex industrial settings. Metropolis fuses hundreds of camera feeds into a comprehensive, real-time map, guiding autonomous mobile robots through intricate environments with high precision. This fusion of real-time AI and digital twin technology not only shows how operational downtime can be drastically reduced but also marks a significant step forward in the quest for smarter, more responsive industrial ecosystems. Through this lens, we glimpse the future of automation, where digital precision and human ingenuity converge to create harmonious, highly optimized workplaces.
Metropolis AI Workflows & Microservices 1.0 is officially live and is set to revolutionize how enterprises and our ecosystem approach centralized perception across arrays of distributed sensors. One of the most exciting features of this release is the Multi-Camera Tracking app, which I had the pleasure of developing. The app is a reference architecture for video analytics applications that track people across multiple cameras and report the count of unique people seen over time, a task known as Multi-Target Multi-Camera (MTMC) tracking.
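To make the counting idea concrete, here is a minimal sketch of how unique-person counts over time can be derived once MTMC tracking has assigned each detection a global identity. This is purely illustrative and not the actual Metropolis implementation; the tuple format, the `global_id` field, and the one-minute window size are all assumptions for the example.

```python
from collections import defaultdict

def count_unique_people(detections, window_sec=60):
    """Count unique global IDs per time window.

    detections: iterable of (timestamp_sec, camera_id, global_id) tuples,
    where global_id is the identity assigned by cross-camera tracking.
    window_sec: hypothetical bucket size (one minute here).
    """
    buckets = defaultdict(set)
    for ts, cam, gid in detections:
        buckets[int(ts // window_sec)].add(gid)
    # One count per window, no matter how many cameras saw each person
    return {w: len(ids) for w, ids in sorted(buckets.items())}

dets = [
    (5,  "cam1", "A"), (12, "cam2", "A"),  # same person, two cameras
    (20, "cam1", "B"),
    (70, "cam3", "A"),                     # person A again, next window
]
print(count_unique_people(dets))  # {0: 2, 1: 1}
```

The key point the sketch captures is that the same person seen by several cameras contributes only once per window, which is exactly what distinguishes MTMC counting from naive per-camera counts.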
NVIDIA has recently announced the release of the TAO Toolkit 4.0, which includes several exciting new features and enhancements. As a developer who has contributed to the toolkit, I’m thrilled to share my experience working on the people re-identification and pose-based action recognition networks, as well as the end-to-end video analytics pipelines on the Triton Inference Server.
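At its core, person re-identification compares appearance embeddings produced by a network and matches a query against a gallery of known identities. The sketch below illustrates that matching step only; the embedding vectors, the gallery structure, and the 0.5 similarity threshold are hypothetical stand-ins, not the TAO Toolkit API.

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def reid_match(query, gallery, threshold=0.5):
    """Return the gallery identity most similar to the query embedding,
    or None if the best similarity falls below the threshold."""
    best_id, best_sim = None, -1.0
    for pid, emb in gallery.items():
        sim = cosine(query, emb)
        if sim > best_sim:
            best_id, best_sim = pid, sim
    return best_id if best_sim >= threshold else None

gallery = {"person_1": [1.0, 0.0], "person_2": [0.0, 1.0]}
print(reid_match([0.9, 0.1], gallery))  # person_1
```

In practice the embeddings come from a trained re-identification network and the gallery is built from previously seen tracklets, but the nearest-embedding-above-threshold decision shown here is the common pattern.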
I am excited to share my experience working on the Amazon One project, an innovative identity service that uses a person’s palm for payment, entry, and more. As a member of the research team that developed and launched Amazon One, I had the opportunity to contribute to this groundbreaking technology in significant ways.