Cloud experience. Big data ecosystem experience. Need a strong analyst with cloud migration project experience: Open API, data modeling, ETL, good exposure to AWS, and strong SQL.
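As an illustration of the AWS, ETL, and SQL exposure this role asks for (not part of the original posting; the bucket, prefix, and table names are hypothetical), a minimal Python sketch that stages CSV files from S3 into an in-memory SQL table using boto3, pandas, and sqlite3:

```python
# Hypothetical sketch: stage CSV objects from an S3 prefix into a SQL table.
import io
import sqlite3

import boto3
import pandas as pd

s3 = boto3.client("s3")  # uses the default AWS credentials chain
resp = s3.list_objects_v2(Bucket="example-migration-bucket", Prefix="staging/")
keys = [obj["Key"] for obj in resp.get("Contents", [])]

conn = sqlite3.connect(":memory:")
for key in keys:
    body = s3.get_object(Bucket="example-migration-bucket", Key=key)["Body"].read()
    df = pd.read_csv(io.BytesIO(body))
    df.to_sql("staging_orders", conn, if_exists="append", index=False)

# Simple SQL sanity check on the staged data
print(conn.execute("SELECT COUNT(*) FROM staging_orders").fetchone())
```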
- Python
- APIs
- Google Cloud (GCP), but someone with AWS or Azure could work
- Good cloud services experience is strongly preferred
- Containers and an understanding of how containers work
- Experience querying different databases, including PostgreSQL
- C++ is a big plus; strong object-oriented programming
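To make the database-querying requirement above concrete, here is a minimal sketch of querying PostgreSQL from Python with psycopg2; the connection details and the orders table are assumptions, not part of the posting:

```python
# Hypothetical example: top customers by order count from a PostgreSQL table.
import psycopg2

conn = psycopg2.connect(
    host="localhost", dbname="appdb", user="app", password="secret"  # assumed credentials
)
with conn, conn.cursor() as cur:
    cur.execute(
        "SELECT customer_id, COUNT(*) AS order_count "
        "FROM orders GROUP BY customer_id ORDER BY order_count DESC LIMIT 10"
    )
    for customer_id, order_count in cur.fetchall():
        print(customer_id, order_count)
conn.close()
```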
• 8 years of experience in Informatica development and databases
• 2-4 years of experience in Teradata
• Good knowledge of DWH concepts
• Exposure to AWS data services
• Ability to gather requirements and convert business requirements into technical requirements; prepare technical specification documents
• Experience developing ETL applications using Informatica PowerCenter (development experience, not just support experience)
• Involved in creating mappings to extract data from source to target using Informatica Designer
• Evaluate all functional requirements and mapping documents; troubleshoot all development processes
• Document all technical specifications and associated project deliverables
• Design test cases to support all systems and perform unit tests
• Should analyze the data and understand the ETL specifications; develop mappings if required
• Develop Informatica 9.x and 10.x mappings, sessions, and workflows
• Involved in creating sessions, command tasks, and workflows using the Workflow Designer
• Good at working with transformations, including complex transformations
• Involved in building AutoSys jobs to schedule the ETL workflows
• Familiar with stored procedures and implementing them in Informatica ETL jobs
• Worked on HLD and LLD documents
• Very good at UNIX and Linux shell scripting
• Strong database experience, with strong knowledge of writing complex SQL scripts as well as SQL tuning
• Extensive Informatica performance tuning experience covering source-level, target-level, and map-level bottlenecks
• Ability to meet deadlines and handle multiple tasks; decisive, with strong leadership qualities; flexible in work schedules; good communication skills
• Team player, motivated, able to grasp things quickly, with analytical and problem-solving skills
• Strong analytical and interpersonal skills with good written and verbal communication
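Informatica mappings are built in the Designer GUI rather than in code, but the extract-transform-load flow they implement can be sketched in plain Python. The sketch below is only an analogue under assumed source and target tables, not an Informatica artifact:

```python
# Illustrative extract-transform-load pattern (hypothetical source/target schemas).
import sqlite3

src = sqlite3.connect("source.db")  # stand-in for a source system
tgt = sqlite3.connect("target.db")  # stand-in for a warehouse target
tgt.execute(
    "CREATE TABLE IF NOT EXISTS dim_customer (customer_id INTEGER PRIMARY KEY, full_name TEXT)"
)

# Extract
rows = src.execute("SELECT id, first_name, last_name FROM customers").fetchall()

# Transform: concatenate names, mirroring a simple expression transformation
transformed = [(cid, f"{first.strip()} {last.strip()}") for cid, first, last in rows]

# Load
tgt.executemany(
    "INSERT OR REPLACE INTO dim_customer (customer_id, full_name) VALUES (?, ?)",
    transformed,
)
tgt.commit()
```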
• 12 to 18 years of experience directing and leading the delivery of highly complex IT applications and systems, including leadership of multiple dependent projects across multiple functional areas and technology platforms
• Responsible for all program oversight, including planning, directing, and managing cross-functional team members
• Experience managing end-to-end AWS data lake implementation programs
• Leads meetings with executive business partners and technology leadership to understand business goals and solution needs, including participation in business requirements definition
• Develops the program-level vision, strategy, approach, and implementation plans required to achieve necessary business outcomes
• Facilitates the creation of business cases, program roadmaps, and program governance structures to support Project Managers
• Ensures the integrated scope, schedule, budget, and quality of multiple project teams
• Extensive experience with agile development methodologies (i.e., Scrum, Kanban, Lean, etc.) as well as hybrid iterative development approaches
• Expertise in managing an offshore/onsite delivery model in an Agile development environment
• Responsible for developing detailed project charters and plans specifying goals, objectives, strategy, scope, budget, scheduling, risk mitigation strategies, contingency plans, approach, requirements, deliverables, timelines, work breakdown structures, use cases, test cases, and training needs, as necessary, for all project assignments
• Identifies, tracks, and manages program-level risks and issues; serves as the escalation point to resolve program issues, remove roadblocks, formulate contingency plans, and communicate status to the program's executive leadership team
• Ensures the sequencing of projects is appropriate at the program level
• Functions as the central point of program communication and status for program team members, stakeholders, management, and executives; communications are professional, timely, and clear, and present the correct level of information based on the audience
• Directs the application of various technical disciplines such as process improvement, data analysis, architectural review, quality assurance, and facilitation
• Provides objective advice, expertise, and leadership support to program team members with the aim of creating value, driving program work, and delivering to business expectations
• Accountable for adherence to established delivery standards and industry best practices to ensure successful project delivery leads to the overall program goals being achieved
Scientific Games has a large-scale effort kicking off where they need 4 specific Oracle EBS functional-technical Quote to Cash consultants (OM, Quoting, and Advanced Pricing; any experience with "Contract Expert" is a plus). These are functional/technical resources with functional business skills as well as technical experience with extracts and tables and the ability to create joins. This is a very complicated environment, and the consultants will have to be stand-alone, lead-type resources on both the functional and technical aspects, with the exception of coding. This is starting up in September and will run for 6+ months.
• Expertise in vRealize Suite Enterprise and SaltStack Security
• Certification in server platforms for Microsoft, Linux, and Solaris, as well as VMware virtualization, is desired
• Five (5) years of previous systems engineering experience
• Operational experience with industry-standard security protocols and procedures is desired
• Minimum of five (5) years of experience in server and storage infrastructure implementation and support
• Minimum of five (5) years of experience in data center operations
• Experience with VMware vSphere (version 6.5)
As a critical member of the DPSCD's Technology Infrastructure Team, the Enterprise Engineer will be responsible for enterprise-level system administration, backup, restore, and disaster recovery. The Enterprise Engineer will also work collaboratively with other members of the Technology team on the deployment, operations, and management of enterprise-level infrastructure systems, including VoIP, Video Management Systems, and cloud solutions.
A Senior Software Engineer - Backend develops systems throughout the client's infrastructure. This involves all elements of software engineering, including application design, system architecture, schema design, and API development. WHAT WILL YOU DO?
• Develop readable, reliable, maintainable, and performant APIs, applications, and libraries
• Collaborate closely with engineering team members and product stakeholders
• Integrate with internal and third-party APIs, applications, and data sources
• Ensure application observability in the form of metrics, logging, and monitoring
• Utilize cloud infrastructure in collaboration with the SRE team to build scalable systems
• Uphold engineering quality and performance standards
• Provide technical mentorship to adjacent team members
• Deliver. Customer. Value.
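As a rough sketch of the observability expectations above (metrics, logging, monitoring), the following hypothetical Flask endpoint logs request latency and tracks a simple in-process counter; the endpoint and metric names are illustrative only:

```python
# Hypothetical observability sketch: structured logging plus a request counter.
import logging
import time
from collections import Counter

from flask import Flask, jsonify

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
app = Flask(__name__)
request_counter = Counter()

@app.get("/health")
def health():
    start = time.perf_counter()
    request_counter["health"] += 1
    payload = {"status": "ok", "requests_served": request_counter["health"]}
    app.logger.info("health check served in %.3f ms", (time.perf_counter() - start) * 1000)
    return jsonify(payload)

if __name__ == "__main__":
    app.run(port=8080)
```

In a production system the counter and timings would typically be exported to a metrics backend rather than kept in memory; this sketch only shows the shape of the instrumentation.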
• Oversee server builds, deployments, packaging, integration, automation, and releases of our software.
• Partner with other Engineering leadership in the team, continuously collaborating with team(s) to evaluate monitoring and performance of the environment and to build standards and consistent practices around Infrastructure as Code.
• Responsible for process development, ensuring opportunities to automate scripts for software builds and deployments, and lead in helping streamline the pipeline.
• Participate in technology/architecture and design discussions; contribute to developing the solution architecture, designing automation templates for infrastructure provisioning, configuration, and change management, and developing the detailed technical design.
• Lead research on new tools, technology solutions, and processes; build consensus and plan to introduce cost-justifiable technology to improve the SDLC pipeline.
• Provide day-to-day technical leadership to the application engineers and be an evangelist of DevOps principles to the team(s).
• Continually find ways to optimize and build out a comprehensive architecture/strategy for supporting applications in production.
• The scope of work includes build, machine configuration, test infrastructure, and virtual machine deployment.
• Level-3 support for the production environment.
• Responsible for protecting and securing all client data held by Ascensus to ensure against unauthorized access to and/or improper transmission of information that could result in harm to a client.
• The I-Client philosophy and the Core Values of People Matter, Quality First and Integrity Always® should be visible in your actions on a day-to-day basis, showing your support of our organizational culture.
• Looking for either a Life70 or Performance Plus developer for MSA production support needs.
• Life70: 5-7 years of mainframe development in COBOL and Assembler based systems
• Minimum of 3-5 years of experience with the Life70 application, with good knowledge of the Life Insurance domain
• Proficiency in Assembler, COBOL, JCL, VSAM, Quality Center, File-AID, Dump-Master, Change-man, Control-M, Xpeditor, File Manager, Quik Job, DYL280, MQ Series, and CICS as well as batch
• Proficient in designing application interfaces for generic and customized feeds
• Must be able to study the current state and articulate the future-state design
• Must be able to code-mine Assembler programs and convert them into COBOL modules
• Must be able to communicate effectively, explain the problem, and propose solutions
• Be flexible and efficiently multi-task
• Take initiative and ownership of issues and resolutions
• Experience working with multiple service providers on fast-paced, business-critical applications
• Performance Plus: 5-7 years of mainframe development in COBOL and DB2
• Minimum of 3-5 years of experience on the Performance Plus / DSS application, with good knowledge of the Life Insurance domain
• Proficiency in COBOL, DB2, JCL, VSAM, Quality Center, File-AID, Dump-Master, Change-man, Control-M, Xpeditor, File Manager, DYL280, MQ Series, and CICS as well as batch
• Proficient in designing application interfaces for generic and customized feeds
• Must be able to study the current state and articulate the future-state design
• Must be able to code COBOL and DB2 programs
• Must be able to communicate effectively, explain the problem, and propose solutions
• Be flexible and efficiently multi-task
• Take initiative and ownership of issues and resolutions
• Experience working with multiple service providers on fast-paced, business-critical applications
• Architecting experience with Palo Alto Networks (PAN)
• Assist with migration from Blue Coat to PAN
• Assist with day-to-day administration operations of PAN
• Assist with troubleshooting of PAN appliances
• Provide direction and guidance in the day-to-day operations of the PAN appliances
• Assist in the coordination and completion of security standard, process, and procedure documentation related to the PAN environment
• Monitor the security environment, identify security gaps, and evaluate and implement enhancements related to the PAN environment
• Provide informal knowledge transfer throughout the engagement
• Perform other duties as requested
• Procure to pay, including the purchase order process and 3-way matching
• Inventory migration from Odoo into NetSuite: bill of materials cleansing, supplier master data cleansing, part numbering optimization
• Fixed asset implementation (move assets from Excel into NetSuite)
• Order to cash: sales orders, invoicing, revenue, cash collections
• Receipt to fulfill (Operations): Salesforce integration, CAD application integration
Managing Azure, VPN tunnels, VLAN setup, etc.
• 2+ years of experience developing trending reports using ServiceNow Performance Analytics
• 2+ years of experience developing, publishing, and scheduling ServiceNow reports
• Strong understanding of the ServiceNow tables key to Incident, Problem, Change, and CMDB
• Demonstrates structured thinking: the ability to break a problem into multiple parts to efficiently solve the problem at hand
• High integrity
• Self-motivated and dependable
• Works well in a team
• Great communication skills
• Strong listening skills
* 3-5+ years as a Data Engineer working with relational databases (SQL)
* Advanced SQL Server skills
* Intermediate ETL (Extract, Transform, Load) skills
* Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
* Strong analytical skills related to working with unstructured datasets
* A successful history of manipulating, processing, and extracting value from large, disconnected datasets
NICE TO HAVE (NOT REQUIRED) QUALIFICATIONS:
* Azure Data Factory
* Experience building and optimizing big data pipelines, architectures, and datasets
PROJECT DESCRIPTION: The Data Engineer position works in a modern tech environment supporting cloud and big data initiatives
* Modern technical working environment with an achievement-oriented atmosphere
* Agile/Scrum methodology
* Microsoft Technology Stack
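To illustrate the root-cause-analysis requirement above, here is a small hypothetical pandas sketch that reconciles an internal dataset against an external feed and isolates the rows that explain a discrepancy; the file and column names are assumptions, not part of the posting:

```python
# Hypothetical reconciliation: find unmatched rows and amount mismatches.
import pandas as pd

internal = pd.read_csv("internal_orders.csv")   # assumed columns: order_id, amount
external = pd.read_csv("vendor_orders.csv")     # assumed columns: order_id, amount

merged = internal.merge(
    external, on="order_id", how="outer", suffixes=("_int", "_ext"), indicator=True
)

missing = merged[merged["_merge"] != "both"]                      # present on one side only
mismatched = merged[
    (merged["_merge"] == "both") & (merged["amount_int"] != merged["amount_ext"])
]

print(f"{len(missing)} unmatched rows, {len(mismatched)} amount mismatches")
```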
• 10+ years of experience in applications development using Java and Java-related frameworks
• Strong knowledge of object-oriented programming, web services, and database concepts
• Experience with the Spring and Spring MVC frameworks, including REST
• Experience using JPA/Hibernate
• Experience with Struts and JSP technologies
• Knowledge of HTML, CSS, Ajax, and jQuery
• Experience in XML message parsing via web services
• Strong background in SQL, JDBC, and application-database performance/troubleshooting (Oracle DB)
· Manufacturing Operational Technology (OT) and Industrial Control System (ICS) Security Analyst
· Drive the Viatris information security strategy and vision for Manufacturing Cyber Security initiatives
· Provide in-depth manufacturing and security knowledge for Manufacturing Cyber Security initiatives with respect to implementation and ongoing support, ensuring strategic alignment with manufacturing objectives
· Work collaboratively with Manufacturing Engineering to create, test, and implement security solutions while minimizing the impact to ongoing production
· Perform solution testing in a lab (proof-of-concept) environment, followed by implementing the solution on the manufacturing plant floor
· Develop and maintain standard documentation on how to implement and maintain manufacturing cybersecurity solutions
· Develop and/or contribute to solutions for complex, cross-functional issues to resolve Manufacturing Cyber Security vulnerabilities
· Analyze industry and cyber security trends to ensure manufacturing strategies are in accordance with IT Compliance qualification standards
We are looking for a Tech Program Manager for the Logistics functional domain, supporting cross-functional projects across the Logistics domain, including Warehouse Management platform DC and hub rollouts, improving DC operations efficiencies, Final Mile and Transportation projects, and Returns. This role will primarily focus on these large Logistics programs, using established project management methodologies and tools to develop and execute technology project plans in alignment with our company's and vendors' Software Development Lifecycle.
As an API Developer, this position will design, develop, enhance, document, and support interoperability solutions as part of a team developing innovative new healthcare and shared business systems integrations. This role requires a highly motivated professional who can help build out our API capabilities. This role will establish API standards and maintain high availability for the services we manage, while also implementing the infrastructure solutions that our product teams need in order to succeed. This role will enable the efficient delivery of high-quality, scalable, and maintainable solutions by working with cross-functional teams (i.e., applications, architecture, data management, analytics, infrastructure, and security) to deliver information efficiently, while architecting solutions around best practices and reusability. Given the complexity of system/data integrations, this role requires competencies in SDLC, message patterns, endpoint security, interfaces between internal and external applications, and solid database knowledge. This role entails knowledge of one or more programming languages, as well as one or more development methodologies and delivery models. This role will develop APIs to enable the transformation of real-time interoperability capabilities, empowering our product teams toward self-service.
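As a minimal sketch of the endpoint-security competency mentioned above (not this team's actual implementation), here is a hypothetical Flask route that rejects requests lacking a valid X-API-Key header before doing any work; the route, header, and key source are assumptions:

```python
# Hypothetical secured endpoint: API-key check before serving the request.
import os

from flask import Flask, jsonify, request

app = Flask(__name__)
API_KEY = os.environ.get("API_KEY", "change-me")  # assumed key source for the sketch

@app.get("/v1/members/<member_id>")
def get_member(member_id: str):
    if request.headers.get("X-API-Key") != API_KEY:
        return jsonify({"error": "unauthorized"}), 401
    # Placeholder payload; a real service would query its data store here.
    return jsonify({"member_id": member_id, "status": "active"})

if __name__ == "__main__":
    app.run(port=5000)
```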
• 5+ years of US-based professional experience
• Experience with ASP.NET web development
• Experience with web services (SOAP, REST, XML, JSON)
• Automation experience (UI, API)
• Ability to go onsite 5 days/week in the Lincolnshire office