Hi, I'm Mauricio
from Bolivia
now based in Spain
Data Scientist and AI enthusiast with a strong physics foundation and a self-driven approach to learning, developing, and deploying innovative AI solutions.


May 2024 - Nov 2024
Hogarth Worldwide
AI Developer Trainee
- Designed and developed an advanced conversational agent based on an LLM, using Mistral as the base model and integrating it with the LangChain and DSPy frameworks. The agent includes a categorization and auto-correction layer that ensures response accuracy and can query an SQL database to retrieve correct numerical information.
- Supported the Content AI team by producing multiple face swaps that were used in advertising campaigns built exclusively on AI-generated images, enhancing content personalization and visual impact.
- Worked on the development of an automatic subtitling system using Faster-Whisper as the main transcription model. Contributed to the creation of simultaneous, multilingual subtitles, improving the accessibility and reach of audiovisual content (sketched below).
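
A minimal sketch of the subtitling step, assuming the open-source faster-whisper package; the audio path, model size, and output file are placeholders rather than the production setup:

```python
# Sketch: transcribe an audio track with faster-whisper and emit SRT subtitles.
# "video_audio.mp3" and the "small" checkpoint are placeholders for illustration only.
from faster_whisper import WhisperModel


def to_srt_timestamp(seconds: float) -> str:
    """Format seconds as an SRT timestamp (HH:MM:SS,mmm)."""
    ms = int(round(seconds * 1000))
    h, ms = divmod(ms, 3_600_000)
    m, ms = divmod(ms, 60_000)
    s, ms = divmod(ms, 1000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"


model = WhisperModel("small", device="cpu", compute_type="int8")

# task="translate" would instead produce English subtitles from any source language.
segments, info = model.transcribe("video_audio.mp3", beam_size=5)

with open("subtitles.srt", "w", encoding="utf-8") as srt:
    for i, seg in enumerate(segments, start=1):
        srt.write(f"{i}\n{to_srt_timestamp(seg.start)} --> {to_srt_timestamp(seg.end)}\n{seg.text.strip()}\n\n")

print(f"Detected language: {info.language} (p={info.language_probability:.2f})")
```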

Jun 2023 - Oct 2023
Pisos Mamut
Data Scientist Internship
- Collaborated with the sales and production teams to define KPIs and key metrics for measuring environmental impact.
- Performed descriptive and statistical analysis of sustainability data stored in Excel files covering 2019 to 2023 (sketched below).
- Used tools such as Power Query to extract valuable insights into how business activities affect sustainability.
- Developed 3 interactive dashboards with Power BI to visualize the sustainability impact of those activities in real time.
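
A minimal sketch of the kind of descriptive analysis this involved, done here with pandas rather than Power Query; the file name and column names (year, energy_kwh, water_m3, waste_kg) are hypothetical:

```python
# Sketch of a descriptive sustainability analysis in pandas.
# The workbook name and columns are hypothetical placeholders, not the real dataset.
import pandas as pd

df = pd.read_excel("sustainability_2019_2023.xlsx")

# Yearly totals and basic descriptive statistics per indicator.
yearly = df.groupby("year")[["energy_kwh", "water_m3", "waste_kg"]].agg(["sum", "mean", "std"])
print(yearly)

# Year-over-year percentage change of the yearly totals, a simple environmental-impact KPI.
totals = df.groupby("year")[["energy_kwh", "water_m3", "waste_kg"]].sum()
print(totals.pct_change().mul(100).round(1))
```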

Oct 2023 - Oct 2024
Universidad Europea de Madrid
M.S. in Big Data
In my final project:
- I integrated three advanced AI models: GroundingDINO, SAM, and Stable Diffusion.
- The goal was to develop a prototype image editor that demonstrates the combined capabilities of these models (see the sketch below).
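
A minimal sketch of how the three models chain together (text prompt -> box -> mask -> inpainting); checkpoint paths, prompts, and the fixed 512x512 resolution are placeholders, and the editor interface and error handling are omitted:

```python
# Sketch of the pipeline: GroundingDINO detects, SAM segments, Stable Diffusion inpaints.
import numpy as np
import torch
from torchvision.ops import box_convert
from PIL import Image

from groundingdino.util.inference import load_model, load_image, predict
from segment_anything import sam_model_registry, SamPredictor
from diffusers import StableDiffusionInpaintPipeline

# 1) Detect the region described in natural language (config/checkpoint paths are placeholders).
dino = load_model("GroundingDINO_SwinT_OGC.py", "groundingdino_swint_ogc.pth")
image_source, image = load_image("photo.jpg")          # RGB array + preprocessed tensor
boxes, logits, phrases = predict(
    model=dino, image=image, caption="a dog", box_threshold=0.35, text_threshold=0.25
)

# GroundingDINO returns normalized cxcywh boxes; SAM expects absolute xyxy coordinates.
h, w, _ = image_source.shape
boxes_xyxy = box_convert(
    boxes * torch.tensor([w, h, w, h]), in_fmt="cxcywh", out_fmt="xyxy"
).numpy()

# 2) Segment the detected object with SAM.
sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h.pth")
predictor = SamPredictor(sam)
predictor.set_image(image_source)
masks, _, _ = predictor.predict(box=boxes_xyxy[0], multimask_output=False)

# 3) Replace the masked region with new content generated from a text prompt.
pipe = StableDiffusionInpaintPipeline.from_pretrained("runwayml/stable-diffusion-inpainting")
init = Image.fromarray(image_source).resize((512, 512))
mask = Image.fromarray((masks[0] * 255).astype(np.uint8)).resize((512, 512))
edited = pipe(prompt="a golden retriever", image=init, mask_image=mask).images[0]
edited.save("edited.png")
```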

Aug 2016 - Nov 2022
Universidad Mayor de San Andres
B.S. in Physics
As a research assistant, I developed my final project:
- Studied the dynamics of the Lorenz-96 attractor
- Employed a Long Short-Term Memory (LSTM) for the prediction task, which is particularly well-suited for long-term predictions due to its architecture.
- Applied a hybrid model that incorporates empirical values into the LSTM framework
- The hybrid LSTM achieved temporal predictions 2.82 times more accurate than the standard LSTM model (a baseline sketch appears below)
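
A minimal sketch of the standard-LSTM baseline on the Lorenz-96 system, with illustrative hyperparameters (N=40, F=8, a 20-step window) rather than the values used in the thesis; the hybrid variant that injects empirical values is not reproduced here:

```python
# Sketch: generate Lorenz-96 trajectories and train a small LSTM to predict the next state.
import numpy as np
from scipy.integrate import solve_ivp
import tensorflow as tf

N, F = 40, 8.0  # number of variables and forcing term (illustrative values)


def lorenz96(t, x):
    # dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F, with cyclic indices.
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F


x0 = F * np.ones(N)
x0[0] += 0.01                                   # small perturbation to trigger chaotic behaviour
t_eval = np.arange(0, 50, 0.01)
sol = solve_ivp(lorenz96, (0, 50), x0, t_eval=t_eval)
data = sol.y.T                                  # shape (time steps, N)

# Sliding windows: predict the next state from the previous `window` states.
window = 20
X = np.stack([data[i:i + window] for i in range(len(data) - window)])
y = data[window:]

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, N)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(N),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=64, validation_split=0.1)
```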
Projects & Competitions

RAG Mastery
Dedicated to mastering Retrieval-Augmented Generation (RAG) and Agents, covering topics such as Simple RAG, RAG with chat history, RAG with a Vector Database, Self-Corrective RAG, and advanced Agent techniques (a minimal retrieval sketch appears below).
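
A minimal sketch of the simple-RAG core (embed, retrieve, ground the prompt), using sentence-transformers and FAISS with toy documents and a toy query:

```python
# Minimal RAG sketch: embed documents, retrieve nearest neighbours, build a grounded prompt.
import faiss
from sentence_transformers import SentenceTransformer

docs = [
    "RAG retrieves relevant passages and feeds them to the LLM as context.",
    "A vector database stores embeddings for fast similarity search.",
    "Self-corrective RAG grades retrieved chunks and retries when they are irrelevant.",
]

encoder = SentenceTransformer("all-MiniLM-L6-v2")
doc_vecs = encoder.encode(docs, normalize_embeddings=True)

index = faiss.IndexFlatIP(doc_vecs.shape[1])    # inner product == cosine on normalized vectors
index.add(doc_vecs)

query = "How does a vector database help RAG?"
q_vec = encoder.encode([query], normalize_embeddings=True)
scores, ids = index.search(q_vec, 2)

context = "\n".join(docs[i] for i in ids[0])
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
print(prompt)   # this grounded prompt would then be sent to the LLM of choice
```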

Sentiment analysis SB-LVII
Analyzed the sentiment of comments about the Super Bowl using Hugging Face's pre-trained Transformer-based model (RoBERTa), revealing that over 56% of the comments were positive while the rest were negative or neutral (sketched below).
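
A minimal sketch of the comment-level classification, assuming the cardiffnlp Twitter RoBERTa checkpoint and placeholder comments; the original project may have used a different RoBERTa variant and a scraped comment dataset:

```python
# Sketch: classify comment sentiment with a pre-trained RoBERTa model and summarise the shares.
from collections import Counter
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="cardiffnlp/twitter-roberta-base-sentiment-latest",  # labels: negative / neutral / positive
)

comments = [  # placeholder comments, not the real dataset
    "That halftime show was incredible!",
    "Way too many ads, the game dragged.",
    "Decent game overall.",
]

labels = [result["label"] for result in classifier(comments)]
counts = Counter(labels)
total = sum(counts.values())
for label, n in counts.items():
    print(f"{label}: {100 * n / total:.1f}%")
```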


Stable Diffusion - Image to Prompts
Trained three different models: a KNN regressor, plus BEiTv2-large and ViT-large-384, which are pre-trained Transformer-based vision models (a fine-tuning sketch appears below).
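
A minimal sketch of the ViT approach, assuming the targets are 384-dimensional sentence embeddings of the prompts and a cosine-similarity training objective; the image path, example prompt, and hyperparameters are placeholders:

```python
# Sketch: fine-tune a pre-trained ViT to regress prompt embeddings from generated images.
import timm
import torch
import torch.nn as nn
from sentence_transformers import SentenceTransformer
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

# Vision backbone: ViT-Large at 384px with a linear head sized to the text-embedding dimension.
backbone = timm.create_model("vit_large_patch16_384", pretrained=True, num_classes=384).to(device)

# Targets: sentence embeddings of the ground-truth prompts.
text_encoder = SentenceTransformer("all-MiniLM-L6-v2")

# Preprocessing that matches the backbone's pretraining configuration.
cfg = timm.data.resolve_data_config({}, model=backbone)
transform = timm.data.create_transform(**cfg)


def training_step(image_path: str, prompt: str, optimizer: torch.optim.Optimizer) -> float:
    """One optimisation step on a single (image, prompt) pair using a cosine-similarity loss."""
    img = transform(Image.open(image_path).convert("RGB")).unsqueeze(0).to(device)
    target = torch.tensor(text_encoder.encode(prompt)).unsqueeze(0).to(device)

    pred = backbone(img)
    loss = 1 - nn.functional.cosine_similarity(pred, target).mean()

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


optimizer = torch.optim.AdamW(backbone.parameters(), lr=1e-5)
# Example call with placeholder data:
# loss = training_step("generated_image.png", "a watercolor painting of a fox", optimizer)
```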