• Perform high-level systems analysis, evaluation, and design; deploy and configure secure NAS (NetApp), 3PAR storage (SAN), and storage networking (Cisco MDS) technologies; integrate and implement virtualization (VMware) and compute (Cisco UCS) platforms
• Build, manage and support highly available, secure and scalable production and lab infrastructure (compute and storage) in the company's global data centers and public clouds
• Monitor production and lab environments, troubleshoot alerts and incidents, and detect and fix infrastructure problems
• Research new compute, virtualization and storage technologies and make recommendations to improve the availability, reliability and performance of the systems supporting the company's external-facing applications
• Automate/script daily operational support tasks
• Create and maintain documentation of the infrastructure and configurations (architecture diagrams, procedures, etc.) and provide training to relevant staff within and outside the immediate team
• Participate in on-call rotation as needed
• Expert with 15+ years of overall IT experience in lead roles in Azure infrastructure provisioning and security administration, including at least 6 years of Azure experience
• Expert in provisioning and configuring Compute, Network, Storage, AI, and ML Azure components
• Responsible for establishing standards, best practices, and blueprints for infrastructure provisioning and security administration of Azure Data, Analytics & AI components
• Expert in enabling trending tools and technologies on the Azure platform for development teams to adopt and deliver applications on
• Must have hands-on expertise in Azure DevOps; responsible for designing Azure DevOps pipelines for IaC and code promotions between environments
• Establish guidelines and processes for disaster prevention and recovery
• Expert in automating Azure Backup and Recovery mechanisms
• Ideate and develop automations for Azure provisioning, security, and DevOps implementations
• Must be an expert in Azure CLI/PowerShell scripting, ARM templates, and YAML
• Work with Risk & Compliance teams to ensure the Azure environment adheres to corporate and industry guidelines
• Must be an expert troubleshooter of Azure infrastructure and development issues
• Must be able to explore and gain authoritative knowledge of new concepts rapidly
• Must have excellent communication and stakeholder management skills
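As a rough illustration of the provisioning automation this role calls for, the sketch below uses the Azure SDK for Python to create a tagged resource group; the subscription ID, resource group name, and tags are hypothetical placeholders, and a production blueprint would more likely be expressed as ARM templates or an Azure DevOps pipeline.

```python
# Requires the azure-identity and azure-mgmt-resource packages.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

SUBSCRIPTION_ID = "<subscription-id>"  # placeholder, not a real subscription

credential = DefaultAzureCredential()
client = ResourceManagementClient(credential, SUBSCRIPTION_ID)

# Idempotently create (or update) a resource group with standard tags,
# the kind of guardrail a provisioning blueprint might enforce.
rg = client.resource_groups.create_or_update(
    "rg-data-analytics-dev",  # hypothetical name
    {"location": "eastus", "tags": {"env": "dev", "owner": "platform-team"}},
)
print(rg.name, rg.location)
```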
Data Engineer background (understands ELT pipelines), responsible for the quality of the data. Will perform white-box testing in the pipeline using automation tools and scripting. dbt uses SQL and Jinja as code, and Great Expectations uses Python.
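As a hedged sketch of what a white-box data quality check in this pipeline might look like, the example below uses Great Expectations' legacy pandas API; the file name and column names are hypothetical, and a real project would typically run such expectations from a checkpoint alongside the dbt models.

```python
import great_expectations as ge
import pandas as pd

# Hypothetical sample of a transformed table produced by the dbt pipeline.
df = ge.from_pandas(pd.read_csv("orders_transformed.csv"))

# Declare data quality expectations against the pipeline output.
df.expect_column_values_to_not_be_null("order_id")
df.expect_column_values_to_be_unique("order_id")
df.expect_column_values_to_be_between("order_total", min_value=0)

# Run every expectation registered above and report the overall result.
results = df.validate()
print(results)
```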
· At least 10 years of experience
· .NET Core APIs
· Kubernetes (K8s) deployments
· Experience with lift-and-shift migrations from on-prem to Azure
· Terraform, ADO pipelines
· OAuth security
· Event-driven architecture (Kafka/ESBs)
Responsible for eliciting, understanding, interpreting and representing business requirements. Act as the conduit between the customer and technical teams to ensure requirements are understood. Responsible for understanding business processes to develop business models. Provide subject matter expertise on the use of data and educate teams on the business model, metadata and standards. Provide leadership and mentoring to junior associates. Act as a change agent in support of operational improvements. Recommend and establish standards, guidelines and procedures to improve operational efficiencies. Responsible for understanding source systems and their data models. Develop source-to-target mappings for data lineage. Document source architecture, including data flows. Responsible for analyzing data to validate business domains and requirements. Responsible for data profiling and ensuring data quality requirements are accurate and complete. Act in an advisory capacity in data model reviews, architecture approach and solution design to ensure high-quality deliverables. Responsible for partnering with management and business units on innovative ways to successfully utilize data and related tools to advance business objectives. Work with the governance council to establish data governance standards and guidelines. Responsible for active involvement with Quality Assurance teams in defining the QA strategy, test plans and use cases to validate data. Responsible for providing day-to-day support, troubleshooting, and incident management and resolution. Responsible for communicating planned and unplanned activities to appropriate parties as needed. Support PMO pre-project data assessments. Support PMO conversion of Tableau/Cognos reports to Microsoft Power BI. Write Power BI reports and dashboards. Provide ad hoc support for business reporting requests. Assist with business data lake testing/experimentation. Assist with coordinating data dictionary completions. Mentor project DA resources.
• Lead design of Hadoop and SQL based solutions
• Perform development of Hadoop and SQL based solutions
• Perform unit and integration testing
• Collaborate with senior resources to ensure consistent development practices
• Provide mentoring to junior resources
• Participate in retrospective reviews
• Participate in the estimation process for new work and releases
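As a minimal sketch of the kind of Hadoop/SQL development described above (assuming a Spark-on-Hadoop stack, which the posting does not specify), the example below expresses the business logic in Spark SQL so it can be reviewed and unit tested; the table and column names are hypothetical.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("claims-aggregation").getOrCreate()

# Hypothetical source table registered in the Hive metastore.
claims = spark.table("warehouse.claims")
claims.createOrReplaceTempView("claims")

# Keep the transformation in SQL so analysts and reviewers can read it directly.
daily_totals = spark.sql("""
    SELECT claim_date, COUNT(*) AS claim_count, SUM(amount) AS total_amount
    FROM claims
    GROUP BY claim_date
""")

# Persist the result back to the warehouse.
daily_totals.write.mode("overwrite").saveAsTable("warehouse.claims_daily_totals")
```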
The FoodChain ID Technology team is charged with developing the industry's first comprehensive, integrated food technology platform. We aim to digitalize the process of food safety compliance, certification, and food-related data services in an end-to-end unified, seamless, and frictionless process. This platform enables our customers to make complex decisions regarding food safety, new product development, managing food supply chains, and managing regulatory and label compliance with high speed and accuracy. We use the latest technologies and tools, including AI and machine learning, to provide our customers a one-of-a-kind solution. The Azure Developer will be responsible for developing the Data Services Platform (DSP) aligned with FoodChain ID’s strategic mission and vision by translating customer, business, and market needs into features/functions for the company’s technology platforms.
Android Developer with Kotlin. Handling and coordination with offshore and onsite teams. 5-7 years of overall experience.
Must have 6+ years of experience with .NET. Must have 3 years of experience with Azure.
• 4-6 years of experience developing with BI tools such as Looker, Tableau, etc.
• At least 1 year of experience with Looker
• Experience interacting with the business to gather requirements and develop reports and dashboards
• Some experience in Python and other scripting languages
• Good understanding of data extraction and loading
• Exposure to Snowflake is a plus
· Scrum Master certification with a deep understanding of all things Agile (Scrum, Kanban and other methodologies)
· 3+ years of multi-project management experience in a technical (SDLC) environment
· Keen understanding of release and deployment processes for frontend and backend platforms
· Experience and strong technical understanding of video CMS, streaming technologies, live stream operations, or metadata/video localization
· Strong understanding of engineering platforms and tools, including but not limited to effective branching and merging strategies, pull requests, and code reviews using Bitbucket, GitHub, Jenkins, and CI/CD workflows
· Proficiency in Jira (Fix Version management and reporting dashboards), Confluence, and Google Suite, and able to put together a presentation on the status of releases as needed
Good knowledge of Oracle Database 12c/19c
Good knowledge of ERP 12.1/12.2
Experience with DB/ERP cloning activities
Ability to work on production support DBA activities
Ability to work independently and handle a team, work with the business to gather requirements, and prepare technical specification documents
Ability to create/review/audit technical specifications
Adhere to internal processes and tools to deliver projects and programs
Participate in regular meetings and provide status updates
Ability to work with distributed teams
Must be self-motivated with strong team orientation and the ability to learn quickly
Must be organized and capable of handling multiple projects and deadlines
Must possess excellent written and verbal communication and interpersonal skills
Additional skills: Shell/Perl/Python scripting
· Experience with Dell Boomi architecture
· Experience with Dell Boomi EDI and complex mappings
· Skills in SQL scripting and databases
· Broad knowledge of application integration best practices, design patterns, and technical industry standards
· Strong technical skills in middleware, integration, security, deployment and configuration
· Experience with integrations and the development of web services and APIs
· Dell Boomi certified candidate preferred
· Developing SFDC CRM workflows and custom solutions for specific business needs
· Apex and strong Visualforce development
· Lightning Web Components (LWC) UI development experience
You will be responsible for developing and implementing Cargo technology product strategy for Salesforce (SFDC), managing functionality, and optimizing performance.
Lead the design, development, migration and implementation of several complex web-based Java applications in/to the Azure cloud environment using Java/J2EE development within the software development lifecycle (SDLC). Support the Architecture team in defining the technology roadmaps for the applications. Understand the business needs and convert them into a technology and cloud architecture fit by leveraging architecture-approved tools and solutions. Have proven experience with cloud technologies, using Java as the native development language. Experience working with Kubernetes and other container technologies to lead the team toward the right decision. Estimate the amount of work required to complete a given task with 95% accuracy, having minimal input on the actual work. Proven experience planning out development schedules and defining technical project milestones, working with project management and the delivery manager. Proven experience articulating project deliverables with technical details, understanding architectural guidelines, and presenting architectural details to a large board of members and others. Understand and be able to develop applications using the following environments/languages: Java, J2EE, JSP, AJAX, Ext JS 4.0, AngularJS (preferred), JBoss Application Server, web application architecture (MVC), web services (XML, SOAP), JavaScript, HTML, CSS, DB2/UDB/Oracle Exadata/SQL Server, and DevOps pipelines.
Pharma Product Owner / BA with a digital background and a good CMS background.
Plans and executes business workstream deliverables within a project. This project will give the candidate the opportunity to learn international business and reinsurance, and exposure to collaborating across multiple functions of the company.
Responsible for implementation and ongoing administration of Hadoop infrastructure on Cloudera Data Platform (CDP) and CDH. Align with the systems engineering team to propose and deploy the new hardware and software environments required for Hadoop and to expand existing environments. Work with internal teams to set up new Hadoop users; this includes setting up Linux users, setting up Kerberos principals, and testing HDFS, Hive and MapReduce access for the new users. Performance tuning of Hadoop clusters and Hadoop MapReduce routines. Screen Hadoop cluster job performance and capacity planning. Monitor Hadoop cluster connectivity and security. Manage and review Hadoop log files. File system management and monitoring. Team with the infrastructure, network, database, application, and business teams to guarantee highly available systems. Collaborate with application teams to install Hadoop updates, patches, and version upgrades when required. Collaborate with end users to resolve complex issues and provide solutions that meet business needs and benefit system performance. Troubleshoot application problems related to configuration, network, and server issues, and provide solutions for recovery. Participate in postmortems to avoid repeated incidents. User support and provisioning access. Process improvement, including the creation of new automation to streamline manual tasks. Develop and update documentation, departmental technical procedures, and user guides.
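A minimal sketch of the user-onboarding steps mentioned above (Kerberos principal plus HDFS home directory), wrapping the standard kadmin.local and hdfs CLI commands from Python; the username, realm, and directory conventions are hypothetical and would follow the cluster's own standards.

```python
import subprocess

def run(cmd: str) -> None:
    """Run a shell command and raise if it exits non-zero."""
    subprocess.run(cmd, shell=True, check=True)

def provision_hadoop_user(username: str, realm: str = "EXAMPLE.COM") -> None:
    # Create a Kerberos principal for the new user (run on the KDC host).
    run(f'kadmin.local -q "addprinc -randkey {username}@{realm}"')
    # Create the user's HDFS home directory and hand over ownership.
    run(f"hdfs dfs -mkdir -p /user/{username}")
    run(f"hdfs dfs -chown {username}:{username} /user/{username}")
    # Smoke-test: the new home directory should now be listable.
    run(f"hdfs dfs -ls /user/{username}")

if __name__ == "__main__":
    provision_hadoop_user("new_analyst")  # hypothetical account name
```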
• Create and maintain optimal data pipeline architecture, including building a data pipeline infrastructure for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, cloud-based relational or non-relational databases employing Talend, and/or scripting languages like Perl/Python
• Identify, design, and implement internal process improvements, including automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
• Work with data and analytics experts to strive for greater functionality in the organization’s data integration platform
• Collaborate with technical staff to identify, learn, and understand software problems
• Follow established configuration/change control processes
• Identify options for potential solutions and assess them for both technical and business suitability
• Work closely with peers, stakeholders, and end users to ensure technical compatibility and user satisfaction
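As an illustrative sketch of one extract-transform-load step scripted in Python (the posting equally allows Talend or Perl), the example below pulls an incremental slice with SQL, derives a column, and appends to a staging table; the connection strings, table, and column names are hypothetical.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection strings; real pipelines would read these from config/secrets.
source = create_engine("postgresql://user:pass@source-host/sales")
target = create_engine("postgresql://user:pass@warehouse-host/analytics")

# Extract: pull yesterday's orders with plain SQL.
orders = pd.read_sql(
    "SELECT * FROM orders WHERE order_date >= CURRENT_DATE - 1", source
)

# Transform: basic cleanup and a derived column.
orders["order_total"] = orders["quantity"] * orders["unit_price"]
orders = orders.dropna(subset=["customer_id"])

# Load: append into the warehouse staging table.
orders.to_sql("stg_orders", target, if_exists="append", index=False)
```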
· With minimal supervision, implement business and IT data requirements through new data strategies and designs across complex data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization and analytics)
· Work with business and application/solution/architect teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models
· Collaborate on a day-to-day basis with business partners, data architects and data engineers to identify, troubleshoot and resolve enterprise data modeling issues
· Design conceptual, logical and physical data models while maintaining a data dictionary and capturing metadata
· Ensure developed data models are easy to use and efficient for accessing data
· Provide support to business users and the development team in resolving data modeling related issues by performing root-cause analysis, making recommendations and working collaboratively to come up with comprehensive solutions
· Facilitate Joint Application Development (JAD) sessions to determine data rules; hold LDM reviews with data experts, peer data modelers, development teams, and architecture teams
· Develop data solutions and strategies based on analysis of business goals, needs, and existing infrastructure, using best-practice principles, concepts and techniques
· Oversee and govern the expansion of existing data architecture and the optimization of data query performance via best practices
· Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks
· Guide less experienced data modelers on a day-to-day basis
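As a small illustration of turning a logical design into a physical model (assuming a relational target and SQLAlchemy, neither of which the posting mandates), the sketch below declares one dimension and one fact table; the table and column names are hypothetical.

```python
from sqlalchemy import Column, Date, ForeignKey, Integer, Numeric, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class DimCustomer(Base):
    """Dimension: one row per customer, carrying descriptive attributes."""
    __tablename__ = "dim_customer"
    customer_key = Column(Integer, primary_key=True)
    customer_name = Column(String(200), nullable=False)
    segment = Column(String(50))

class FactOrder(Base):
    """Fact: one row per order, keyed to its dimensions."""
    __tablename__ = "fact_order"
    order_key = Column(Integer, primary_key=True)
    customer_key = Column(Integer, ForeignKey("dim_customer.customer_key"), nullable=False)
    order_date = Column(Date, nullable=False)
    order_amount = Column(Numeric(12, 2), nullable=False)
```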