List of Open Positions

On-site
Delhi, Noida, Uttar Pradesh, India
Posted 1 week ago

Job Overview

We are seeking an experienced Backend Java Developer with strong hands-on expertise in Java 8/9, Core Java, backend integrations, and modern engineering practices. The ideal candidate should have solid design pattern knowledge, experience working on enterprise-grade integrations, and exposure to cloud and DevOps tools.


Primary Skills (Must Have)

  • Java 8/9, Core Java

  • Design Patterns (beyond Singleton/Factory)

  • Web Services Development (REST & SOAP)

  • XML & JSON processing

  • CI/CD pipelines


Secondary Skills (Good to Have)

  • Jenkins

  • Kubernetes

  • Google Cloud Platform (GCP)

  • SAP JCo Library

Optional Certifications: OCPJP, Google Professional Cloud Developer


Required Experience

  • Strong experience in integration component development using Java 8/9 & SOA

  • Deep knowledge of integration architecture and design patterns

  • Hands-on experience with tools such as Eclipse, NetBeans, JIRA, Confluence, Bitbucket, SVN

  • Exposure to modern IT trends: Cloud Platforms, DevOps, Low-Code, Intelligent Automation

  • Experience with GCP is an added advantage

  • Preferably worked on 3–4 integration adapters or connectors for ERP/CRM/HCM/SCM/Billing applications


Behavioral Competencies

  • Experience working with US/Europe clients in an onsite–offshore delivery model

  • Strong verbal & written communication, documentation, and presentation skills

  • Excellent analytical and problem-solving abilities

  • Effective task prioritization, time management, and stakeholder management

  • Quick learner and strong team player

  • Ability to work under tight deadlines in a matrix environment

  • Demonstrated Organizational Citizenship Behavior (OCB)


Job Responsibilities

  • Write design specifications and user stories for assigned modules

  • Develop backend components/classes and support QA teams with test cases

  • Maintain coding best practices and conduct peer code reviews

  • Participate in Scrum ceremonies (Daily Standup, Planning, Retro, Demos)

  • Identify technical/design/architecture risks and create mitigation plans

  • Adhere to prescribed development processes and documentation standards

  • Collaborate with Architects and cross-functional teams on technical issues, demos, POCs, and proposals

  • Contribute to internal knowledge repositories, reusable accelerators, and IP creation

  • Mentor junior developers and provide constructive feedback

  • Deliver internal training on new technologies

  • Participate in organizational initiatives—interviews, CSR activities, events, policy implementation

Job Features

Job Category: Full Time


Remote
California, USA
Posted 3 weeks ago

Responsibilities: 

Kafka Cluster Management: 
  • Install, configure, and maintain Confluent Kafka clusters in an on-premises environment. 
  • Manage Kafka brokers, Zookeeper, Connect, Kafka Streams, and related services. 
  • Monitor and tune the performance of Kafka clusters to meet operational SLAs. 
Security & Compliance: 
  • Implement and manage Kafka security including SSL/TLS, ACLs, role-based access control (RBAC). 
  • Ensure that Kafka infrastructure complies with the organization's security policies and best practices. 
Monitoring & Alerting: 
  • Set up monitoring and alerting for Kafka clusters using tools like Prometheus, Grafana, or Confluent Control Center. 
  • Analyze logs and metrics to proactively detect and resolve issues. 
Automation (Good to have): 
  • Develop automation scripts and frameworks to streamline Kafka operations, such as provisioning, configuration, scaling, and monitoring using tools like Ansible, Terraform, or custom Python scripts. 
  • Automate Kafka cluster deployments, upgrades, and patch management. 
  • Implement auto-scaling, failover, and disaster recovery solutions through automation. 
  • Knowledge of CI/CD pipelines and infrastructure-as-code (IaC) concepts for Kafka deployments.
Maintenance & Upgrades: 
  • Regularly perform maintenance and upgrades for Kafka components (brokers, Zookeeper, etc.). 
  • Handle Kafka backup and disaster recovery procedures. 
Troubleshooting & Support: 
  • Provide operational support and incident management for Kafka-related issues. 
  • Work closely with developers and data engineers to resolve application-level issues. 
Documentation: 
  • Maintain accurate documentation of Kafka architecture, configuration, and operational procedures. 
  • Create runbooks for various Kafka administrative tasks. 
Experience & Skills Needed: 
  • 3–5 years of hands-on experience managing Confluent Kafka clusters in production environments, preferably on-premises. 
  • Solid understanding of distributed systems, high availability, and failover mechanisms. 
  • Experience with Kafka Cluster Linking and cross-cluster replication. 
Location/Work Timings/Travel: 
  • Remote working position – preference for candidates based in the US. 
Work Timings (US CDT): 
  • At least six hours of coverage for US CDT for candidates based in India. 
  • Travel required at the start of the engagement. 

Job Features

Job Category: Full Time


Remote
Delhi, Kolkata
Posted 4 weeks ago

Role Overview: 

We are seeking an experienced Lead Data Analyst with deep expertise in Customer Data Platforms (CDP), Agentic AI, data management, and marketing analytics. This role focuses on enabling data-driven decision-making and optimizing customer engagement through advanced analytics, AI, and integrated marketing data ecosystems. The ideal candidate has strong technical and analytical skills; extensive experience in Adobe Analytics, ROI measurement, and customer loyalty analysis; and the ability to translate complex data insights into actionable business strategies. As Lead Data Analyst, you will serve as the technical backbone for cross-channel data integration, campaign measurement, and AI-driven personalization, ensuring seamless collaboration between marketing, product, and data engineering teams.

Key Responsibilities: 

  • Lead the design and implementation of data models and pipelines supporting marketing analytics and customer intelligence. 
  • Integrate and manage data from Customer Data Platforms (CDPs), CRM, and marketing automation tools to create a unified customer view. 
  • Leverage Agentic AI and Gen AI frameworks to automate analytical workflows and drive advanced insights. 
  • Analyze marketing performance across channels to measure ROI, customer engagement, retention, and lifetime value. 
  • Build scalable dashboards, reports, and visualizations for marketing and executive stakeholders using tools like Tableau, Power BI, or Looker. 
  • Develop and apply machine learning models for segmentation, churn prediction, personalization, and recommendation systems. 
  • Drive experimentation and A/B testing to evaluate marketing effectiveness and optimize performance. 
  • Ensure data quality, governance, and compliance across data systems, analytics platforms, and pipelines. 
  • Partner with Marketing, Product, Data Science, and Engineering teams to define KPIs, optimize customer journeys, and enhance customer loyalty initiatives. 
  • Translate analytical findings into strategic recommendations to improve marketing performance and business outcomes. 
 

Requirements: 

  • 6–8 years of experience in data analytics, marketing analytics, or data science, with a strong technical foundation. 
  • Proven experience working with Customer Data Platforms (CDP) and marketing data ecosystems. 
  • Hands-on expertise with Adobe Analytics, Google Analytics, and related CRM/data management platforms. 
  • Advanced proficiency in SQL and experience working with large-scale datasets and cloud environments. 
  • Strong programming skills in Python or R for statistical analysis, modeling, and data processing. 
  • Deep understanding of marketing attribution, ROI modeling, and customer journey analytics. 
  • Experience in data visualization, dashboarding, and storytelling through data. 
  • Strong foundation in statistical testing, A/B testing, and experiment design. 

Preferred Qualifications: 

  • Experience with cloud-based data platforms (e.g., Snowflake, BigQuery, AWS, GCP). 
  • Hands-on experience in CDP configuration, audience segmentation, and data activation workflows. 
  • Knowledge of Agentic AI applications, predictive modeling, and real-time personalization frameworks. 
  • Background in consumer-centric industries such as retail, eCommerce, or financial services. 
  • Strong understanding of data governance, metadata management, and data quality frameworks. 
 

Job Features

Job Category: Contractual


Remote
Delhi
Posted 2 months ago

Role Overview: 

We are seeking a highly skilled Marketing Analytics Specialist with strong experience in data analytics, marketing performance measurement, and applied AI/ML techniques. In this contract role, you will support advanced marketing strategies by leveraging data science to optimize campaigns, understand customer behavior, and drive business growth. The ideal candidate brings a blend of technical expertise, business acumen, and hands-on experience using machine learning in a marketing context.

Key Responsibilities: 

  • Analyze marketing performance across channels using advanced data techniques. 
  • Build dashboards and reports to monitor KPIs such as customer acquisition, conversion, engagement, and ROI. 
  • Develop and apply machine learning models for customer segmentation, churn prediction, recommendation systems, and lifetime value estimation. 
  • Design and evaluate A/B tests and experiments to support marketing optimization. 
  • Partner with cross-functional teams (Marketing, Product, Data Engineering, Data Science) to define metrics and improve campaign targeting and personalization. 
  • Translate complex analytical findings into clear business insights and recommendations. 
  • Ensure high standards of data quality, integrity, and governance across tools and systems. 

Requirements: 

  • 6–8 years of experience in marketing analytics, data science, or related field. 
  • Proven experience applying AI/ML techniques to marketing problems (e.g., clustering, classification, regression, recommendation engines). 
  • Advanced proficiency in SQL and experience working with large datasets. 
  • Strong programming skills in Python or R for data analysis and modeling. 
  • Familiarity with marketing platforms and tools (e.g., Google Analytics, Adobe Analytics, CRM systems). 
  • Proficiency in data visualization tools (e.g., Tableau, Power BI, Looker). 
  • Solid understanding of marketing funnels, attribution, and customer journey analytics. 
  • Experience with statistical testing, experimentation, and A/B testing. 

Preferred Qualifications: 

  • Experience working with cloud-based data platforms (e.g., Snowflake, BigQuery, AWS, GCP). 
  • Exposure to CDPs, marketing automation platforms, and real-time data pipelines. 
  • Background in consumer-facing industries such as retail, eCommerce, or financial services. 
  • Knowledge of predictive modeling and personalization frameworks. 

Job Features

Job Category: Contractual


Hybrid
Delhi, Noida, Uttar Pradesh, India
Posted 3 months ago

About the Role: 

We are seeking a dynamic and experienced AI Sales Head to lead our AI services and solutions sales efforts across India. This strategic leadership role demands a consultative sales background, strong understanding of AI products and services, and a proven track record of driving revenue growth in enterprise and mid-market segments, while keeping overall profitability in mind.  

Key Responsibilities: 

  • Sales Leadership: Own and drive the sales strategy for technical and consulting services, including AI-based solutions, across India. 
  • Lead Generation & Pipeline Building: Proactively identify, qualify, and develop new business opportunities through a combination of outbound prospecting, partner channels, events, and inbound marketing leads. 
  • Client Engagement: Build and nurture relationships with key decision-makers (CXO level) at existing customers across industries to understand their business needs and position our services as strategic solutions. 
  • Pan-India Expansion: Manage and grow a sales pipeline across regions with a focus on enterprise accounts, ensuring consistent YoY growth. 
  • Solution Selling: Lead consultative sales engagements, working closely with technical teams to craft tailored proposals and value propositions. 
  • Partnership & Alliances: Collaborate with technology partners, AI platform providers, and system integrators to expand our solution portfolio. 
  • Team Collaboration: Work closely with delivery, pre-sales, and product teams to ensure successful project handoffs and customer satisfaction. 
  • Sales Forecasting & Reporting: Maintain accurate sales forecasts and provide regular updates to senior management. 
 

Requirements: 

  • Experience: 5+ years in technical or consulting services sales, preferably with exposure to AI/ML technologies, cloud services, or enterprise digital transformation projects. 
  • Domain Expertise: Strong understanding of AI-based services and solutions (e.g., AI consulting, AI product implementation, automation, data platforms). 
  • Geography: Must be based in the Delhi NCR region and open to travel across India as needed. 
  • Sales Acumen: Demonstrated ability to close complex, high-value deals and manage long sales cycles. 
  • Communication: Excellent presentation, negotiation, and stakeholder management skills. 
  • Travel: Willingness to travel up to 50% of the time across India for client meetings and events. 

Preferred Qualifications: 

  • Experience working with Indian IT consulting firms. 
  • Familiarity with enterprise AI adoption trends and technologies. 
  • Existing client relationships in BFSI, Manufacturing, Retail, or Public Sector. 

Compensation: 

  • Competitive base salary 
  • Attractive commissions and incentives based on performance and deal closures 
  

Job Features

Job Category: Full Time


On-site
Sunnyvale, United States
Posted 3 months ago
We are seeking a highly skilled Java Backend Engineer with expertise in agentic workflows and MCP (Model Context Protocol) service integration to join our engineering team. The ideal candidate will design, develop, and optimize backend systems that power intelligent, autonomous agents and context-driven services. You will collaborate with cross-functional teams to architect scalable solutions that leverage cutting-edge AI workflows while maintaining reliability, performance, and security.

Responsibilities:

  • Design, develop, and maintain scalable Java-based backend services to support agentic workflows and AI-driven applications.
  • Implement and optimize MCP services to enable seamless context sharing and dynamic orchestration between models, agents, and tools.
  • Architect APIs, microservices, and event-driven systems that ensure high performance, reliability, and low-latency communication.
  • Collaborate with data scientists, AI/ML engineers, and frontend developers to integrate agentic intelligence into production systems.
  • Write clean, maintainable, and testable code while following best practices in software engineering.
  • Monitor, troubleshoot, and optimize system performance, scalability, and fault tolerance.
  • Contribute to workflow automation, context management, and intelligent decision-making systems.
  • Stay up to date with emerging technologies in AI, distributed systems, and backend engineering.

Qualifications/Required Skills:

  • Bachelor's/Master's degree in Computer Science, Engineering, or a related field.
  • 5+ years of backend engineering experience with strong expertise in Java (Java 11+) and Python.
  • Proven experience with agentic workflows (autonomous task orchestration, tool use, context-driven execution).
  • Hands-on experience with MCP (Model Context Protocol) service development and integration.
  • Strong understanding of microservices architecture, RESTful APIs, gRPC, and message queues (Kafka, RabbitMQ, etc.).
  • Experience with databases (SQL & NoSQL) and caching solutions (Redis, Memcached).
  • Familiarity with cloud platforms (AWS, GCP, or Azure) and containerization (Docker, Kubernetes).
  • Solid grasp of concurrency, multithreading, and distributed systems design.
  • Proficiency in CI/CD pipelines, testing frameworks (JUnit, Mockito), and code quality tools.
  • Excellent problem-solving, debugging, and communication skills.

Preferred Qualifications:

  • Experience with AI/ML infrastructure, LLM-based applications, or agent frameworks.
  • Knowledge of event sourcing, CQRS, and workflow engines (e.g., Temporal, Camunda, Airflow).
  • Contributions to open-source projects related to agentic systems or MCP.
  • Understanding of observability (logging, tracing, metrics) in distributed systems.

Job Features

Job Category: Full Time


On-site
Sunnyvale, United States
Posted 3 months ago
Responsibilities:
  • Continuous deployment using GitHub Actions, Flux, and Kustomize
  • Design and implement cloud solutions, build MLOps on AWS cloud
  • Data science model containerization, deployment using Docker, VLLM, Kubernetes
  • Communicate with a team of data scientists, data engineers, and architects, and document the processes
  • Develop and deploy scalable tools and services for our clients to handle machine learning training and inference.
  • Knowledge of ML models and LLM
Qualifications:
  • 6+ years of experience in MLOps with strong knowledge of Kubernetes, Python, MongoDB, and AWS.
  • Good understanding of Apache SOLR.
  • Proficient with Linux administration.
  • Knowledge of ML models and LLM.
  • Ability to understand tools used by data scientists and experience with software development and test automation
  • Ability to design and implement cloud solutions and ability to build MLOps pipelines on cloud solutions (AWS)
  • Experience working with cloud computing and database systems
  • Experience building custom integrations between cloud-based systems using APIs
  • Experience developing and maintaining ML systems built with open-source tools
  • Experience with MLOps Frameworks like Kubeflow, MLFlow, DataRobot, Airflow etc., experience with Docker and Kubernetes
  • Experience developing containers and Kubernetes in cloud computing environments
  • Familiarity with one or more data-oriented workflow orchestration frameworks (Kubeflow, Airflow, Argo, etc.)
  • Ability to translate business needs to technical requirements
  • Strong understanding of software testing, benchmarking, and continuous integration
  • Exposure to machine learning methodology and best practices
  • Good communication skills and ability to work in a team
Note: The focus is 60% SRE and 40% MLOps.

Skill Area                              | Includes                                                                                                   | Weight (%)
Platform Reliability & Containerization | Kubernetes, Docker, Microservices, Linux                                                                   | 30%
MLOps & AWS Cloud                       | Model deployment, versioning, monitoring, AWS (SageMaker, S3, Lambda, EKS)                                 | 25%
CI/CD & GitOps                          | GitHub Actions, Flux                                                                                       | 15%
Monitoring & Observability              | Splunk, Grafana, Prometheus, performance tracking                                                          | 15%
Integration & Collaboration             | Python scripting, API integrations, Apache Solr, LLM awareness, teamwork with data scientists & engineers  | 15%

Job Features

Job Category: Full Time


On-site
Sunnyvale, United States
Posted 3 months ago
Job Overview: We are seeking a Technical Lead with strong managerial capabilities to drive the design, delivery, and optimization of AI/ML and data platform initiatives. The ideal candidate will coordinate multiple workstreams, lead cross-functional teams, and ensure high-quality, timely execution while maintaining strategic alignment with business goals. This role combines hands-on technical expertise with project management skills.

Key Responsibilities:
  • Coordinate multiple workstreams simultaneously, ensuring timely delivery and adherence to quality standards.
  • Facilitate daily stand-ups and syncs across global time zones, maintaining visibility and accountability.
  • Understand business domains and technical architecture to enable informed decisions and proactive risk management.
  • Collaborate with data engineers, AI/ML scientists, analysts, and product teams to translate business goals into actionable plans.
  • Track project progress using Agile or hybrid methodologies, escalate blockers, and resolve dependencies.
  • Own the task lifecycle from planning through execution, delivery, and retrospectives.
  • Perform technical reviews of data pipelines, ETL processes, and architecture, identifying quality or design gaps.
  • Evaluate and optimize data aggregation logic while ensuring alignment with business semantics.
  • Contribute to the design and development of RAG pipelines and workflows involving LLMs.
  • Create and maintain Tableau dashboards and reports aligned with business KPIs for stakeholders.
  • Mentor and guide team members, fostering knowledge sharing, collaboration, and skill development.
Required Skills
  • Strong expertise in data pipelines, architecture, and analytics platforms (e.g., Snowflake, Tableau).
  • Experience reviewing and optimizing data transformations, aggregations, and business logic.
  • Hands-on familiarity with LLMs and practical RAG implementations.
  • Knowledge of AI/ML workflows, model lifecycle management, and experimentation frameworks.
  • Proven experience in managing complex, multi-track projects.
  • Skilled in project tracking and collaboration tools (Jira, Confluence, or equivalent).
  • Excellent communication and coordination skills with technical and non-technical stakeholders.
  • Experience working with cross-functional, globally distributed teams.
Additional Skills
  • Exposure to cloud platforms (AWS) and modern DevOps practices.
  • Ability to create executive-level dashboards and summaries for leadership.
  • Strong analytical and problem-solving mindset.
  • Understanding of semantic layers for consistent business logic interpretation.

Job Features

Job Category: Full Time


On-site
United States
Posted 3 months ago
Job Overview: We are looking for an experienced Data Engineer with AI Insights to design and deliver robust data solutions that enable AI-driven analytics and decision-making. The role combines strong expertise in Snowflake and Python with hands-on experience in building ETL/ELT pipelines, preparing data for AI/ML workflows, and supporting modern LLM- and RAG-based applications.

Key Responsibilities:
  • Develop and optimize ETL/ELT pipelines using Python, SQL, and Snowflake to ensure high-quality data for analytics, AI, and LLM workflows.
  • Build and manage Snowflake data models and warehouses, focusing on performance, scalability, and security.
  • Collaborate with AI/ML teams to prepare datasets for model training, inference, and LLM/RAG-based solutions.
  • Automate data workflows, validation, and monitoring for reliable AI/ML execution.
  • Support RAG pipelines and LLM data integration, enabling AI-driven insights and knowledge retrieval.
  • Partner with business and analytics teams to transform raw data into actionable AI-powered insights.
  • Contribute to dashboarding and reporting using Tableau, Power BI, or equivalent tools.
Qualifications
  • 5+ years of Data Engineering experience with exposure to AI/ML workflows.
  • Advanced expertise in Python programming and SQL.
  • Hands-on experience with Snowflake (data warehousing, schema design, performance tuning).
  • Experience building scalable ETL/ELT pipelines and integrating structured/unstructured data.
  • Familiarity with LLM and RAG workflows, and how data supports these AI applications.
  • Experience with reporting/visualization tools (Tableau).
  • Strong problem-solving, communication, and cross-functional collaboration skills.
Preferred Skills
  • Experience with MLOps pipelines and ML frameworks (Scikit-learn, TensorFlow, PyTorch).
  • Knowledge of cloud data platforms (AWS) and orchestration tools (Airflow, dbt).
  • Exposure to containerization (Docker, Kubernetes) and CI/CD practices.
  • Ability to partner with business and analytics teams to deliver actionable insights.

Job Features

Job Category: Full Time
