Must have a minimum of 10 years of experience. Need someone who has supported applications in .NET and Angular. Should have knowledge of Azure. Need someone senior who can act as a Tech Lead.
This individual will be responsible for leading or participating in Trade Compliance related projects by analyzing, implementing, and supporting applications in accordance with the TEIS development and project management methodologies to support strategic and tactical compliance initiatives. A strong candidate will have a solid background in Export Trade Compliance, including experience using and supporting automated solutions.
• At least 3 years of report development and analytics experience
• Working with users in a requirements analysis and documentation role
• Knowledge of logical and physical data modeling concepts (relational and dimensional)
• Healthcare and clinical workflow experience
• Experience with Microsoft SQL Server Reporting Services (SSRS) or Power BI
• SQL programming, scripting, and stored procedures experience
• Knowledge of the software development life cycle is preferred
• Experience with Analysis Services
• Experience with data cube development and reporting
• Experience with Microsoft SSIS
• Requires strong analytical, conceptual, and problem-solving abilities
• Excellent interpersonal (verbal and written) communication skills are required to support working in project environments that include internal and external teams
• Understanding of data integration issues (validation and cleaning); familiarity with complex data and structures
• Thorough understanding of complex data models for healthcare and clinical business systems and the underlying associated processes
Haemonetics is moving to Phase 2 of their Oracle Cloud upgrade. They are looking for an Oracle Cloud Integration Engineer/Analyst who has experience with Oracle Integration Cloud (OIC). This role involves understanding both the business processes and the technical aspects of Oracle integration solutions to ensure effective and efficient integration of various systems and applications. Phase 1 was all ERP Finance; this phase is everything else: ERP Quote to Cash, contracts, fulfillment, Vendor Management, Inventory, Logistics, Warehouse, and Quality. They do not want someone focused only on ERP Finance. Ideally this person has worked in a manufacturing or distribution environment.
The candidate will provide guidance and management of the product rollout. It requires:
• In-depth understanding of IM, EWM, and warehousing best practices
• System integration experience with EDI, 3PLs, and Transportation Management tools, and a good understanding of best logistics practices
• Effective communicator with good leadership and problem-solving skills when facing project risks
• Ability to put the client's interests first
Avionic products are installed in a wide variety of civil and military aircraft. These products provide communication, navigation, and guidance information to the pilot. The software engineering team is responsible for producing the avionics software embedded within these products as well as the production test environments for the products.
The Solution Documentation Tech will be responsible for ensuring all articles related to the release of an SAP S/4 Solution Manager 7.2 project implementation (Work Packages, Work Items, Documentation) are in proper order, condition, and status. This includes taking ownership of all items in the Solution Documentation module within Solution Manager 7.2 and driving the Focused Build (FB) activities related to implementations. Tasks include, but are not limited to, creating daily project audit reports and reviewing them with the business, holding FB and SolDoc training sessions, and reviewing documentation to ensure it has been updated, providing technical and analytical support when it has not. Responsibilities also include evaluating and driving best practices for SolDoc.
• Overall 8-12 years of IT experience
• Must have 5+ years of strong experience in data modeling; able to understand, analyze, and design enterprise data models, with expertise in a data modeling tool
• Experience designing canonical logical and physical data models using a data modeler tool
• Expertise in SQL (PL/SQL) programming and one of the languages Java/JavaScript
• Has built processes supporting data transformation, data structures, metadata, dependency, and workload management
• Experience analyzing source data, transforming data, creating mapping documents, and building design documents, data dictionaries, etc.
• Good experience in end-to-end data warehousing, ETL, and BI projects
• Hands-on experience with an ETL tool such as Informatica
• Hands-on experience with at least one CDC tool: Oracle GoldenGate, Qlik, etc.
• Has led or been involved in data migration projects end to end
• Proficient in Oracle database, complex PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
• Proficient in Snowflake (SnowPro certification desirable)
• Experience in Snowflake data warehousing: data modeling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts
• Has created flexible and scalable Snowflake models
• Experience with data security and data access controls and design
• Provides resolution to an extensive range of complicated data pipeline related problems, proactively and as issues surface
Good to have:
• Experience with Snowflake utilities, SnowSQL, Snowpipe, and big data modeling techniques using Python
• At least two end-to-end implementations of a Snowflake cloud data warehouse
• Snowflake advanced concepts, such as setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy clone, and time travel, and understanding how to use these features
• Should have a good understanding of 3NF and star schema
• Expertise in one programming language (Python/Java/Scala) preferred
• Able to develop and maintain documentation of the data architecture, data flow, and data models of the data warehouse, appropriate for various audiences
• Provide direction on adoption of cloud technologies (Snowflake) and industry best practices in the field of data warehouse architecture and modeling
• Able to troubleshoot problems across infrastructure, platform, and application domains
• Should be able to provide technical leadership to large enterprise-scale projects
• 13+ years of experience working as a developer/architect in the core banking domain • Good overall functional and product feature understanding of FIS, Jack Henry, and FISERV • Deep technical architecture expertise in FIS, Jack Henry, and FISERV platforms • Hands-on experience designing integration solutions using APIs, events, etc. for FIS, Jack Henry, and FISERV • 7+ years of hands-on experience with Kafka, MQ, or other event/messaging products • Expert in API specification/Swagger • Experience with API gateways • Expert in API and application security (OAuth, SAML, OIDC, SSO, etc.) • Provides architectural leadership in shaping strategic technology programs using an API-first approach • Strong hands-on experience with languages like C#/.NET and Java • Experience working in Azure/AWS PaaS, SaaS, and IaaS environments
By joining the company's Capital Markets Enterprise Solutions Group, you’ll be at the heart of a team bringing innovative technology to life as part of our digital transformation. As a Senior Data Engineer/ETL Developer, you’ll contribute to the technical design, development, and integration of cross-functional, multi-platform application systems. You’ll build easy, flexible, and personalized banking solutions that enhance the client experience and help build the bank of the future. We have worked with our client here in Chicago and Canada for the last 7 years, helping them build out their agile development teams in support of their digital transformation to Azure Cloud, Azure SQL, Databricks, and ADF. The core of the current work is on their CECEL (Anti-Money Laundering) systems: ETL work on data in AML/Loan Origination applications using Talend and Azure SQL, query development/optimization, and EDW development. Experience in ETL/ELT, specifically on data-heavy applications like enterprise data warehouses. Good knowledge of Kimball and Inmon models: SCDs, dimensions, facts, and data modeling (star, snowflake). SSIS is good, but experience with tools like Informatica, DataStage, Talend, PySpark, or Databricks is great. We will be working heavily in Talend as of now (we can teach it if experienced in Informatica or DataStage).
Serve as a senior software engineer, contributing to the ongoing development efforts and new initiatives. Guide and mentor the development teams, including internal and external team members. Contribute to the ongoing evolution of the existing portfolio of applications and services. Design, develop, modify, implement, and support software components anywhere in the software stack. Determine root cause for complex software issues and develop efficient, long-term technical solutions. Work in close partnership with cross-functional teams and management.
Looking for 10+ years of experience. Snowflake Developer; MuleSoft is a must.
• Excellent relationship-building skills and ability to liaise with stakeholders at all levels
• Gathering requirements and proposing technical solutions
• Proactive attitude to platform enhancements, including review of existing records
• Desire to implement best-practice solutions
• Understanding of Salesforce sharing and security (roles, profiles, permissions, OWD, sharing rules)
• All aspects of user and license management, including new user setup/deactivation, roles, profiles, permissions, and public groups
• Experience implementing Salesforce configuration changes, including (but not limited to): Workflow, Process Builder, fields, page layouts, record types, custom settings, dashboards, and reports
• User support tickets
• Managing Service Org user profiles and data import
• Solid knowledge of the deployment process
• Ability to understand the business context of Commerce Cloud and Sales Cloud
• Ability to learn and adapt to the environment and scale up quickly to support business-critical processes
• Knowledge of recovering the environment from a point of failure
• User training and creating training materials
• Ability to learn Salesforce Commerce and Marketing Cloud
• Lead development teams, providing technical guidance, mentoring, and oversight throughout the development lifecycle
• Ensure adherence to best practices, coding standards, and quality assurance processes
• Oversee the integration of Salesforce Sales Cloud with other internal and external systems and applications, ensuring adequate data flow, interoperability, and alignment with program requirements
• Develop and implement integration strategies, API management, and data mapping
• Identify, communicate, and proactively manage risk areas associated with solution design; track short-term trade-offs, plan for remediation of technical debt, and commit to seeing an issue through to complete resolution
• Advise and mentor team members by providing guidance on application and integration best practices
• Conduct a study of the client’s use of Salesforce Sales Cloud, producing a report of findings with recommended steps and a roadmap for Salesforce solutions
• Act as Release Manager by reviewing and approving configuration and coded items to be deployed between environments; where appropriate, this could include deploying changes
Mosaic is looking for a dynamic F&O functional analyst / techno-functional consultant. The candidate would be working directly with the client's CFO. Insurance/banking domain experience will be an advantage.
Minimum 8 years of professional experience in data science, analytics, or a similar role. Strong proficiency in data analysis, statistical modeling, and machine learning techniques. Expertise in programming languages such as Python, R, or similar for data manipulation, analysis, and modeling. Experience with data visualization tools (e.g., Tableau, Power BI) to effectively communicate insights. Experience with Dataiku is highly desired. Strong problem-solving skills and the ability to tackle complex business challenges using data-driven approaches. Excellent communication and collaboration skills with the ability to work effectively in cross-functional teams. Proven track record of delivering high-quality data solutions on time and within budget.
The Lead Services Engineer is responsible for technical leadership in building and managing API Governance tools for discovery, design, and lifecycle management. This individual serves as a lead resource and developer on the Digital Guest Experience team, supporting project workload requested by business/marketing and by departments within Information Technology. The Services Engineer provides thought leadership and technical leadership for other services engineers, as well as architects and business analysts, on forward development and feature design/discovery.
Additional Workday Adaptive Planning “sheet” enhancements, including:
• Adding functionality for blanket inflation factors (Adaptive Planning expertise)
• Adding functionality for FTE/employee-related promotions (Adaptive Planning expertise); primarily related to actuarial testing/raises, but the functionality would need to be available to all employees
• Enabling Single Sign-On (SSO) in Workday Adaptive Planning for all users; currently set up to log in based on email address alone (Workday Financials and HCM both use SSO)
• Design, build, and release to production a Workday Financials integration to incorporate drill-down capabilities for actuals within Workday Adaptive Planning (both Adaptive Planning and Workday Financials expertise). We currently load all actuals data from Workday Financials via an integration at a cost center/account level in Workday Adaptive Planning. Our next step is to build a connection between the two so that someone viewing a report in Workday Adaptive can “drill down” on the actuals detail and understand exactly what is included within Workday Financials.
• Design, build, and release to production a Workday HCM integration to incorporate FTE-level reporting within Workday Adaptive Planning and/or HCM (both Adaptive Planning and Workday HCM expertise). Specific to FTEs, we currently load “Salaries & Wages” actuals data from Workday Financials via an integration at a cost center/account level in Workday Adaptive (similar to the above). Our next step is to build a connection between Workday Adaptive and Workday HCM to get FTE analysis and reporting at a more granular level. This could include loading the “budgeted” positions to HCM and building reports within HCM, or possibly loading the actual HCM information into Workday Adaptive Planning and building the reports there.
• Edit/create various Workday Adaptive Planning reports/dashboards and/or Workday Financials/HCM related reports based on the progress above (Adaptive Planning, Workday Financials, and Workday HCM expertise)
• 5+ years of SAP HCM implementation, development, testing, and support experience
• 5+ years of experience in SAP Personnel Administration, Org. Management, Time Management, Payroll implementation, ERP integration, and support is a must
• Integration with time clock systems; time evaluation and time management schema and rule creation and updates
• Strong in USA payroll and legal knowledge; vacation premium, CFDI, Social Insurance, and other legal reporting knowledge is a must
• Good experience modifying/creating payroll and time PCRs and schemas
• Strong knowledge of Time Management, such as creating work schedule rules and holiday calendars
• Ability to adapt well and perform in a new, competitive environment
• Good team player; strong team orientation, hard worker, and enthusiastic
• Involved in unit testing, regression testing, and integration testing
• Ability to resolve configuration and functional business issues across various HR modules in a hands-on manner
• Able to convert business requirements into functional and technical requirements
• Experience supporting various inbound and outbound interfaces from and to the SAP HR system; troubleshooting, job setup, and knowledge of file transfer is a must
• Able to work independently in a dynamic and demanding environment and meet delivery timelines
• Demonstrated ability to work in a team under specified deadlines
• Ability to work remotely and collaborate with teams using online collaboration tools
The Enterprise Data Warehouse (EDW) is the corporate repository of integrated operational data. It is moving from on-premises management of its Data Warehouse and Business Intelligence to cloud data management and reporting. Working with EIS and offshore resources, this role will lead the development of the cloud DW while supporting the current DW environment. Interaction with others: External contact: This position will work with external data and system vendors to exploit cutting-edge data management technologies and integrate/export data following security guidelines. Internal contact: This position will have significant contact with offshore development resources, EIS physical IT resource management and architecture staff, and internal business sponsors and subject matter experts.
1. Extensive knowledge of CIS, billing processes, and meter-to-cash processes
2. Strong technical knowledge of Oracle database SQL, PL/SQL, Unix shell scripting, Oracle Forms, and the Automic job scheduler
3. Knowledge of the Banner product
• 5-8 years in the utility industry with extensive knowledge of CIS, billing processes, and meter-to-cash processes
• Strong business analysis and documentation skills (able to create a business requirements document, functional specification, etc.)
• Working knowledge of the Banner product
• Strong knowledge of database technologies and willingness to learn multiple technologies, including Pro*C, shell scripting, Automic, etc.