FlatironsAI, Remote, United States
As a core member of the team, I played a key role in developing enterprise-level generative AI applications. I also steered product management, covering design reviews, issue resolution, and feature prioritization.
To extend the product's functionality, I modernized metadata entity extraction using few-shot learning, in-context learning, and recursive reasoning. These techniques improved entity classification and ranking, which in turn improved performance on complex enterprise use cases.
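The sketch below illustrates the general few-shot, in-context extraction pattern; it is a minimal illustration, not the production system, and the example documents, labels, and model name are hypothetical placeholders.

```python
# Minimal sketch of few-shot, in-context metadata entity extraction with an LLM.
# The example documents, entity schema, and model name are illustrative only.
import json
from openai import OpenAI  # assumes the openai>=1.0 SDK is installed

client = OpenAI()  # reads OPENAI_API_KEY from the environment

FEW_SHOT_EXAMPLES = [
    {
        "text": "Invoice #8841 from Acme Corp, due 2024-03-01, total $12,400.",
        "entities": {"doc_type": "invoice", "vendor": "Acme Corp", "due_date": "2024-03-01"},
    },
    {
        "text": "Master Services Agreement between Globex and Initech, effective Jan 2023.",
        "entities": {"doc_type": "contract", "parties": ["Globex", "Initech"], "effective_date": "2023-01"},
    },
]

def extract_entities(text: str) -> dict:
    """Extract metadata entities from a document, guided by in-context examples."""
    messages = [{"role": "system",
                 "content": "Extract metadata entities from enterprise documents. "
                            "Respond with JSON only."}]
    for ex in FEW_SHOT_EXAMPLES:  # few-shot examples steer the output schema
        messages.append({"role": "user", "content": ex["text"]})
        messages.append({"role": "assistant", "content": json.dumps(ex["entities"])})
    messages.append({"role": "user", "content": text})

    resp = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    return json.loads(resp.choices[0].message.content)

if __name__ == "__main__":
    print(extract_entities("Purchase order #555 issued to Umbrella LLC on 2024-06-12."))
```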
I also engineered AI pipelines built on transformers, Large Language Models, and Retrieval-Augmented Generation, integrating them to support efficient document ingestion, entity categorization, and vector-based indexing for business applications.
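The ingestion and vector-indexing side of such a pipeline can be sketched roughly as follows; the chunking rule, embedding model, and sample documents are illustrative assumptions, not the actual framework.

```python
# Minimal sketch: document ingestion with chunking, embedding, and vector
# indexing for retrieval-augmented generation. Values below are placeholders.
import numpy as np
import faiss                                    # vector index
from sentence_transformers import SentenceTransformer

embedder = SentenceTransformer("all-MiniLM-L6-v2")

def chunk(text: str, size: int = 400) -> list[str]:
    """Naive fixed-size chunking; real pipelines usually split on structure."""
    return [text[i:i + size] for i in range(0, len(text), size)]

docs = ["...long business document one...", "...long business document two..."]
chunks = [c for d in docs for c in chunk(d)]

# Embed chunks and build an inner-product index over L2-normalised vectors
vectors = embedder.encode(chunks, normalize_embeddings=True)
index = faiss.IndexFlatIP(vectors.shape[1])
index.add(np.asarray(vectors, dtype="float32"))

def retrieve(query: str, k: int = 3) -> list[str]:
    """Return the k chunks most similar to the query."""
    q = embedder.encode([query], normalize_embeddings=True)
    _, ids = index.search(np.asarray(q, dtype="float32"), k)
    return [chunks[i] for i in ids[0]]

print(retrieve("payment terms"))
```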
During the project, I deepened my skills on the Azure and OpenAI platforms, working with a fast-moving startup team to ship weekly enhancements to production. This cadence kept product launches on schedule and laid the foundation for scalable, impactful AI-driven solutions.
Virufy, Remote, United States
As part of the GenAI development team, I contributed to building a machine learning model in Python and TensorFlow to detect COVID-19 from cough sounds. By applying feature extraction and transfer learning techniques, I helped the model reach high accuracy. I also designed and implemented robust data pipelines to preprocess and extract features from audio data, ensuring consistent input quality for large-scale deployment on AWS and Azure.
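A minimal sketch of this kind of audio transfer-learning setup is shown below, assuming log-mel spectrogram inputs and an ImageNet-pretrained backbone; the actual Virufy dataset, paths, and hyperparameters are not reproduced here.

```python
# Minimal sketch: cough audio -> log-mel spectrogram -> frozen pretrained
# backbone + small classification head. All values are illustrative.
import numpy as np
import librosa
import tensorflow as tf

def cough_to_spectrogram(path: str, sr: int = 16000, size: int = 96) -> np.ndarray:
    """Load a cough recording and convert it to a 3-channel log-mel image."""
    audio, _ = librosa.load(path, sr=sr, mono=True)
    mel = librosa.feature.melspectrogram(y=audio, sr=sr, n_mels=size)
    mel_db = librosa.power_to_db(mel, ref=np.max)
    mel_db = librosa.util.fix_length(mel_db, size=size, axis=1)       # pad/trim time axis
    mel_db = (mel_db - mel_db.min()) / (mel_db.max() - mel_db.min() + 1e-8)
    return np.repeat(mel_db[..., None], 3, axis=-1)                   # shape (96, 96, 3)

# Transfer learning: freeze the ImageNet backbone, train only the head
base = tf.keras.applications.MobileNetV2(include_top=False, weights="imagenet",
                                         input_shape=(96, 96, 3), pooling="avg")
base.trainable = False
model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # P(COVID-positive cough)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["AUC"])
# model.fit(train_spectrograms, train_labels, validation_data=..., epochs=10)
```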
To further enhance the model, I integrated Generative AI (GenAI) and Large Language Models (LLMs) to explain its outputs. Using Retrieval-Augmented Generation (RAG), the system retrieved relevant medical literature and gave healthcare professionals context-aware explanations to support decision-making. This integration improved the model's interpretability and supported better patient outcomes, making it a more valuable tool for healthcare applications.
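The explanation step can be sketched as follows, assuming passages have already been retrieved by a vector search; the passages, model name, and prompt wording are illustrative, not the deployed system.

```python
# Minimal sketch: turn a classifier score plus retrieved literature excerpts
# into a clinician-facing explanation. Passages and model name are placeholders.
from openai import OpenAI

client = OpenAI()

def explain_prediction(score: float, retrieved_passages: list[str]) -> str:
    """Ask an LLM to explain a cough-classifier score using retrieved literature."""
    context = "\n\n".join(f"[{i + 1}] {p}" for i, p in enumerate(retrieved_passages))
    prompt = (
        f"A cough-audio classifier returned a COVID-19 likelihood of {score:.2f}.\n"
        "Using only the excerpts below, explain to a clinician what findings "
        "support or qualify this result, citing excerpts by number.\n\n"
        f"{context}"
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

passages = ["Dry coughs in COVID-19 patients show reduced spectral variability ...",
            "Mel-frequency features separate wet and dry cough phenotypes ..."]
print(explain_prediction(0.82, passages))
```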
Deque Systems, Boulder, CO, USA
As the team lead for a capstone project at Deque Systems, I spearheaded an initiative to enhance digital accessibility by refining their Axe Advisor AI tool. My primary responsibility was guiding a cross-functional team to leverage data from Deque University to improve the learning experience for users. This involved leading the development of custom Python algorithms to analyze learner data, helping optimize course design and personalize learning paths for users with diverse needs.
A key aspect of my role was overseeing the implementation of data pipelines that integrated data from various sources, including Google Analytics, SQL databases, and REST APIs. With data flowing reliably between these systems, we gathered valuable insights on user engagement and course completion times, which led to strategic redesigns of several courses and directly improved user satisfaction and learning outcomes. I also worked closely with key stakeholders to keep all improvements aligned with WCAG accessibility standards, making the platform more inclusive.
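A simplified sketch of such a pipeline is shown below; the CSV export, database URL, REST endpoint, and column names are hypothetical placeholders rather than Deque's actual systems.

```python
# Minimal sketch: join engagement, completion, and catalog data to report
# median time-to-complete per course. All source locations are placeholders.
import pandas as pd
import requests
from sqlalchemy import create_engine

def load_engagement(ga_export_csv: str) -> pd.DataFrame:
    """Google Analytics export: one row per user/course session."""
    return pd.read_csv(ga_export_csv, parse_dates=["session_start"])

def load_completions(db_url: str) -> pd.DataFrame:
    """Course completion records from the learning platform database."""
    engine = create_engine(db_url)
    query = "SELECT user_id, course_id, completed_at, minutes_spent FROM completions"
    return pd.read_sql(query, engine)

def load_course_catalog(api_base: str) -> pd.DataFrame:
    """Course metadata (course_id, course_title, ...) from a REST API."""
    resp = requests.get(f"{api_base}/courses", timeout=30)
    resp.raise_for_status()
    return pd.DataFrame(resp.json())

def completion_time_report(engagement: pd.DataFrame,
                           completions: pd.DataFrame,
                           catalog: pd.DataFrame) -> pd.DataFrame:
    """Median time-to-complete and session counts per course, for redesign decisions."""
    merged = (completions
              .merge(engagement, on=["user_id", "course_id"], how="left")
              .merge(catalog, on="course_id", how="left"))
    return (merged.groupby(["course_id", "course_title"])
                  .agg(median_minutes=("minutes_spent", "median"),
                       sessions=("session_start", "count"))
                  .reset_index()
                  .sort_values("median_minutes", ascending=False))
```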
Throughout the project, I combined technical expertise in machine learning and AI with leadership and collaboration skills, guiding the team through challenges and implementing solutions that improved accessibility and the overall user experience on Deque's platform. The project deepened my commitment to using data science to solve real-world problems, particularly in the realm of digital accessibility.
Tata Consultancy Services, Hyderabad, India
As a member of the converged and hyper-converged support team, I oversaw customer infrastructures across multiple regions, supporting approximately 50 domains with a focus on reliable support and performance optimization.
One of my significant contributions to the team was reducing Incident MTTR by 40%. Through meticulous data analysis and the development of automation scripts, I streamlined processes and expedited ticket resolution.
Using Python, I designed and implemented scripts that interacted with a third-party ticketing tool. By applying statistical models to the data fetched from the tool, I proactively identified potential risk areas within customer environments, allowing us to take preventive measures before issues could affect the stability of their infrastructures.
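A simplified sketch of this kind of script is shown below; the ticketing API endpoint, fields, and threshold are hypothetical, since the actual third-party tool is not named here.

```python
# Minimal sketch: pull incident tickets from a (hypothetical) ticketing REST
# API and flag domains whose incident volume is a statistical outlier.
import pandas as pd
import requests

def fetch_incidents(api_base: str, token: str) -> pd.DataFrame:
    """Pull recent incident tickets; expects columns: domain, opened_at, resolved_at."""
    resp = requests.get(f"{api_base}/incidents",
                        headers={"Authorization": f"Bearer {token}"},
                        params={"state": "closed", "days": 90},
                        timeout=30)
    resp.raise_for_status()
    return pd.DataFrame(resp.json())

def flag_risky_domains(incidents: pd.DataFrame, z_threshold: float = 2.0) -> pd.DataFrame:
    """Flag domains whose incident count deviates strongly from the mean."""
    counts = incidents.groupby("domain").size().rename("incident_count").reset_index()
    mean = counts["incident_count"].mean()
    std = counts["incident_count"].std(ddof=0)
    counts["z_score"] = (counts["incident_count"] - mean) / (std or 1.0)
    return counts[counts["z_score"] > z_threshold].sort_values("z_score", ascending=False)
```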
Furthermore, I provided strategic recommendations to our customers, proposing the consolidation of their dispersed infrastructures onto a single SaaS platform with projected cost savings of €129k per year.
Eckovation Solutions, New Delhi, India
During my ML internship, I researched deep learning and neural networks, focusing on advanced methodologies and practical implementation with frameworks such as TensorFlow, PyTorch, and Keras. I developed a custom Convolutional Neural Network (CNN) for handwritten digit recognition, using Python, NumPy, and Matplotlib for data preprocessing, visualization, and performance analysis. Through data augmentation and hyperparameter tuning, I significantly improved the model's generalization and accuracy.
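A minimal sketch of such a CNN is shown below, using the public MNIST dataset rather than the internship's exact architecture or tuned hyperparameters.

```python
# Minimal sketch: small Keras CNN for handwritten digit recognition with
# light in-model data augmentation. Hyperparameters are illustrative.
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0          # (60000, 28, 28, 1), scaled to [0, 1]
x_test = x_test[..., None] / 255.0

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(28, 28, 1)),
    # Augmentation layers are active only during training and improve generalization
    tf.keras.layers.RandomRotation(0.05),
    tf.keras.layers.RandomTranslation(0.1, 0.1),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, validation_split=0.1, batch_size=128)
print(model.evaluate(x_test, y_test))
```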
Collaborating with senior researchers, I continually refined the CNN architecture to optimize its performance. The project was thoroughly documented, covering model architecture, training processes, and experimental outcomes. I presented my findings to key stakeholders, showcasing the model's potential for real-world applications, including its scalability and ability to handle diverse datasets.