Gen AI Development Intern
USA - Remote
SonicWall
Defend SMBs, enterprises and governments from advanced cyber attacks with SonicWall's award-winning firewalls and cybersecurity solutions. SonicWall is a cybersecurity forerunner with more than 30 years of expertise and is recognized as a leading partner-first company, ensuring our partners and their customers are never alone in the fight against cybercrime. With the ability to build, scale and manage security across cloud, hybrid and traditional environments in real time, SonicWall provides relentless security against the most evasive cyberattacks across endless exposure points for increasingly remote, mobile and cloud-enabled users. With its own threat research center, SonicWall can quickly and economically provide purpose-built security solutions to enable any organization—enterprises, government agencies and SMBs—around the world. For more information, visit www.sonicwall.com or follow us on Twitter, LinkedIn, Facebook and Instagram.
Description of the Intern Qualification
Coursework in AI and Python programming; a passion for Gen AI / LLMs; a cost- and performance-optimization mindset; and strong attention to detail and perseverance in producing the best results through experimentation and analysis.
Topic
Gen AI Development
Background
SonicWall is developing a Gen AI chatbot to offer its customers an AI-backed, next-generation user experience for network device and data monitoring and management. The chatbot uses marketplace LLMs and vector databases as part of its technology stack. The Gen AI landscape changes quickly, with new LLM models released frequently, while the operational cost of LLMs is a major component of running the application.
Objective
Cost optimization
Scope of Work
Token optimization: The intern will compare different static and dynamic extraction techniques to reduce the number of LLM tokens in a given request, improving on the current numbers, which serve as a baseline. Because LLM usage is the main component of the application's operational cost and token count contributes to it directly, this work translates into cost savings.
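To illustrate the kind of experiment this work involves, here is a minimal sketch of comparing prompt-reduction techniques against a baseline token count. The 4-characters-per-token estimate and the two example techniques are illustrative assumptions, not the application's actual pipeline or tokenizer.

```python
# Hypothetical sketch: measure token savings of prompt-reduction techniques
# relative to a baseline. The heuristics below are illustrative assumptions.

def estimate_tokens(text: str) -> int:
    """Rough token estimate (~4 characters per token for English prose)."""
    return max(1, len(text) // 4)

def strip_boilerplate(prompt: str) -> str:
    """Static extraction: drop blank lines and comment-style noise."""
    kept = [ln for ln in prompt.splitlines()
            if ln.strip() and not ln.lstrip().startswith("#")]
    return "\n".join(kept)

def truncate_context(prompt: str, max_tokens: int = 50) -> str:
    """Dynamic extraction: keep only the most recent context within a token budget."""
    budget_chars = max_tokens * 4
    return prompt[-budget_chars:] if len(prompt) > budget_chars else prompt

def compare(prompt: str) -> dict:
    """Report estimated tokens and percentage savings for each technique."""
    baseline = estimate_tokens(prompt)
    results = {"baseline": baseline}
    for name, fn in [("strip_boilerplate", strip_boilerplate),
                     ("truncate_context", truncate_context)]:
        reduced = estimate_tokens(fn(prompt))
        results[name] = {
            "tokens": reduced,
            "savings_pct": round(100 * (baseline - reduced) / baseline, 1),
        }
    return results
```

In practice the baseline would come from the real tokenizer's counts on production traffic, and each candidate technique would also be checked against answer quality, not just token count.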
LLM Models: The application currently uses Claude 3.5 Sonnet v2 as its LLM, provided through AWS Bedrock. AWS releases additional models from time to time. The intern will experiment with additional model(s) and provide a report analyzing their suitability for the application.
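A model comparison of this kind might be sketched as a small benchmark over Bedrock's Converse API, recording latency, token usage and estimated cost per model. The model ID and the pricing table below are illustrative assumptions; current models and rates should be checked in the Bedrock console before relying on them.

```python
# Hypothetical sketch of a Bedrock model "bake-off". The pricing figures are
# illustrative per-1K-token rates (USD), not authoritative AWS pricing.
import time

PRICING = {
    "anthropic.claude-3-5-sonnet-20241022-v2:0": {"input": 0.003, "output": 0.015},
}

def benchmark(client, model_id: str, prompt: str) -> dict:
    """Send one prompt via the Converse API and record latency, tokens and cost."""
    start = time.perf_counter()
    resp = client.converse(
        modelId=model_id,
        messages=[{"role": "user", "content": [{"text": prompt}]}],
        inferenceConfig={"maxTokens": 512},
    )
    latency_s = time.perf_counter() - start
    usage = resp["usage"]
    price = PRICING.get(model_id, {"input": 0.0, "output": 0.0})
    cost = (usage["inputTokens"] * price["input"]
            + usage["outputTokens"] * price["output"]) / 1000
    return {
        "model": model_id,
        "latency_s": round(latency_s, 3),
        "input_tokens": usage["inputTokens"],
        "output_tokens": usage["outputTokens"],
        "est_cost_usd": round(cost, 6),
        "answer": resp["output"]["message"]["content"][0]["text"],
    }
```

In live use, `client` would be `boto3.client("bedrock-runtime", region_name=...)`, and the same prompt set would be run against each candidate model alongside an accuracy evaluation of the answers.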
Expected Deliverables
- Reduction in LLM tokens
- Comparison report of newer LLM model(s) to existing model in terms of accuracy, cost and latency of LLM responses
Technologies and Tools
Python, AWS Bedrock, Gitlab
#LI-KB7
#LI-Internship
#LI-USA
#LI-GenAIIntern
#LI-GenAIChatbot
SonicWall is an equal opportunity employer.
We are committed to creating a diverse environment. All qualified applicants receive consideration for employment without regard to race, color, ethnicity, religion, sex, gender, gender identity and expression, sexual orientation, national origin, disability, age, marital status, veteran status, pregnancy, or any other basis prohibited by applicable law.
At SonicWall, we pride ourselves on recruiting a diverse mix of talented people and providing active security solutions in 100+ countries.