Encraft AI
At Encraft AI, we partner closely with enterprise clients to design secure, scalable, and practical AI solutions. Our mission is to blend deep technical expertise with business-first thinking, ensuring every solution we deliver drives measurable impact and aligns with our clients’ strategic goals.
Role Summary
As an SDE I (Backend) Intern, you’ll join our core engineering team to design, develop, and deploy backend systems and data pipelines that power our AI products.
Key Responsibilities
- Design, build, and maintain Python-based backend services (FastAPI, Flask, or Django).
- Develop and manage data pipelines for ingestion, transformation, and orchestration of structured/unstructured data.
- Integrate APIs, data sources, and internal microservices to support AI workflows and data products.
- Implement and maintain deployment pipelines using Docker, CI/CD tools, and cloud platforms (AWS/GCP/Azure).
- Ensure scalability, reliability, and performance of backend and data systems.
- Collaborate closely with team members to deliver end-to-end solutions.
- Write clean, maintainable, and testable code with clear documentation and participate in code reviews.
- Contribute to architecture discussions and engineering best practices.
Required Qualifications and Skills
- Bachelor’s degree in Computer Science, Information Technology, or a related engineering field.
- Strong programming skills in Python, with a focus on backend development.
- Familiarity with API development, RESTful design, and microservice architectures.
- Understanding of data engineering workflows: ETL, pipeline creation, and data management.
- Experience with databases (PostgreSQL, MySQL, MongoDB, etc.).
- Knowledge of containerization (Docker) and CI/CD tools (GitHub Actions, Jenkins, etc.).
- Good understanding of cloud deployment and DevOps fundamentals.
- Strong analytical, debugging, and problem-solving skills.
- Effective communication and teamwork in a fast-paced, startup environment.
Nice-to-Have Skills & Experiences
- Experience with FastAPI or Flask for production-grade applications.
- Familiarity with data orchestration tools (Airflow, Prefect, Dagster, etc.).
- Exposure to AWS/GCP data services (S3, Lambda, BigQuery, etc.).
- Understanding of message queues (Kafka, RabbitMQ) and event-driven systems.
- Experience with testing frameworks (Pytest, Unittest).
- Prior internship or significant project experience in backend or data-engineering roles.
- Experience with RAG systems, prompt engineering, context engineering, and vector databases.
What You'll Get
- Opportunity to work on real AI and data products from day one.
- Mentorship from senior engineers and exposure to the full AI project lifecycle.
- Access to a modern tech stack with a focus on automation, scalability, and reliability.
- Flat hierarchy, collaborative culture, and ownership-driven environment.
- A chance to shape scalable, impactful AI solutions at the intersection of data and engineering.
Key details
- Joining: Immediate
- Location: Fully remote
- Experience: 0–2 years; final-year students may also apply
- Compensation: INR 30,000–50,000 per month
- Role: 6-month internship with the opportunity to convert to a full-time (FTE) role
Apply
Please fill in the following form to apply for this opening.