My research focuses on GPU programming, robotics systems, and parallel computing, supported by a strong foundation in artificial intelligence (AI) and machine learning. I am dedicated to advancing intelligent systems through rigorous scientific investigation. I am proficient in C++ and Python, using libraries such as PyTorch, NumPy, and Pandas for data analysis and machine learning, and I work with CUDA and ROCm for GPU-accelerated parallel processing.
My technical contributions include developing high-performance parallel algorithms and optimizing robotics systems. Notable projects include implementing a context-aware generative pretrained transformer (GPT) model with retrieval-augmented generation (RAG) using LangChain and FastAPI, and designing a transformer architecture inspired by the seminal paper "Attention Is All You Need." I have also contributed to autonomous data management and perception systems for Edinburgh University Formula Student (EUFS), using my skills in ROS2 and Linux to build robust real-time systems and parallel processing frameworks.
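As a small illustration of the kind of mechanism that transformer project is built around, here is a minimal PyTorch sketch of scaled dot-product attention as described in "Attention Is All You Need"; the function name, tensor shapes, and example values are illustrative and are not taken from the project's actual code.

```python
import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    """Minimal scaled dot-product attention sketch.

    q, k, v: tensors of shape (batch, heads, seq_len, head_dim); shapes are illustrative.
    """
    d_k = q.size(-1)
    # Similarity scores between queries and keys, scaled by sqrt(d_k).
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)
    if mask is not None:
        # Positions where mask == 0 are excluded from attention (e.g. causal masking).
        scores = scores.masked_fill(mask == 0, float("-inf"))
    # Softmax over the key dimension gives the attention weights.
    weights = F.softmax(scores, dim=-1)
    # Weighted sum of the value vectors.
    return weights @ v

# Example usage with random tensors (batch=2, heads=4, seq_len=8, head_dim=16).
q = torch.randn(2, 4, 8, 16)
k = torch.randn(2, 4, 8, 16)
v = torch.randn(2, 4, 8, 16)
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # torch.Size([2, 4, 8, 16])
```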
Outside of my technical pursuits, I stay active with regular gym sessions, running, tennis, and badminton. In my downtime, I enjoy reading books on productivity, psychology, and science fiction to spark new ideas and refresh my perspective on the world. Additionally, my Steam account is CyberAlbertCamus.