Discover Jobs
Unlock your career potential with Discover Jobs: a gateway to endless opportunities and professional growth.
7+ years of experience required. Performs forms development and related activities on Oracle Documaker, preferably on version 12 or above.
· Oracle Documaker Studio MRL development (DAL scripting, forms composition, unit testing)
· Proficient in Documaker rules and overflow concepts
· Forms development (static and dynamic forms)
· Build and apply rules (DAL scripting)
· Data file mapping and formatting
· Automation of code promotion using scripting, Lbysync, Lbrymanage, Lbyproc
· File-based library generation and ODBC code promotion
· Experience working with version control systems, preferably GitHub
· Knowledge of Jenkins, TeamCity, etc.
· Experience with Docupresentment
· Additional programming languages/skill sets such as SQL, Visual Basic scripting, DAL scripting, XML, JSON
· Proficient in building complex documents such as statements and policy packets
· Experience setting up Documaker jobs for batch, real-time, and interactive document generation
· Develops and maintains the business knowledge necessary to understand the customer area
· Resolves design issues; identifies design improvements, increases re-use, and reduces redundant code
· Unit testing and artefacts
· Creates design documents
· Maintains the requirements traceability matrix (RTM)
· Completes trainings to keep up with new Documaker features
· 8 years of overall experience required
· 3+ years of data science, data analytics, or business intelligence experience
· Strong understanding of SQL
· Experience conducting data analysis in Python and/or R
· Experience building clear, easy-to-understand dashboards and presentations
· Ability to communicate clearly and effectively with cross-functional partners of varying technical levels
· Ability to define relevant metrics that can guide and influence stakeholders toward appropriate and accurate insights
8+ years of relevant industry experience with a BS/Master's degree
· Experience with distributed processing technologies and frameworks, such as Hadoop, Spark, and Kafka, and distributed storage systems (e.g., HDFS, S3)
· Demonstrated ability to analyze large data sets to identify gaps and inconsistencies, provide data insights, and advance effective product solutions
· Expertise with ETL schedulers such as Apache Airflow, Luigi, Oozie, AWS Glue, or similar frameworks
· Solid understanding of data-warehousing concepts and hands-on experience with relational databases (e.g., PostgreSQL, MySQL) and columnar databases (e.g., Redshift, BigQuery, HBase, ClickHouse)
· Excellent written and verbal communication skills
Analyze, design, program, debug, and modify software enhancements and internal data projects. Interact with internal teams to define system requirements and/or necessary modifications. Participate in software design meetings and analyze user needs to determine technical requirements. Write technical specifications based on conceptual design and stated business requirements.
1) Must have an ISTQB certification
2) A strong understanding of QA principles: the different types of testing (functional, system, integration, regression, UAT); how to develop test cases and suites and write bug reports; and understanding the movement of requirements, builds, and testing deliverables
3) 3+ years with testing techniques such as Boundary Value Analysis (BVA), Equivalence Class Partitioning, Decision Table-based testing, State Transition, Use Case, Error Guessing, and Exploratory testing
4) Strong understanding of the different test levels (Unit, Component, System, System Integration, Acceptance)
5) Must have hardware-testing experience with printers/end-user devices, plus web and mobile device testing (including cellphones)
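The boundary-value and equivalence-partitioning techniques named above can be illustrated with a toy sketch. The eligible age range 18–65 and the `is_eligible` function are assumptions made up for illustration, not anything from the posting:

```python
def is_eligible(age: int) -> bool:
    """Toy system under test: accepts ages in the inclusive range 18-65."""
    return 18 <= age <= 65

# Boundary Value Analysis: test at, just below, and just above each boundary.
bva_cases = {17: False, 18: True, 19: True, 64: True, 65: True, 66: False}

# Equivalence Class Partitioning: one representative value per class
# (below range, in range, above range).
ecp_cases = {5: False, 40: True, 90: False}

for age, expected in {**bva_cases, **ecp_cases}.items():
    assert is_eligible(age) == expected, f"unexpected result for age {age}"
```

The point of BVA is that off-by-one defects cluster at the edges, so the values 17/18 and 65/66 do most of the work; the equivalence-class representatives then cover each partition once instead of exhaustively.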
As a senior developer, responsible for leading the technical aspects of Adobe Experience Cloud implementations, including microsites, DAM repositories, Content-as-a-Service, and system integration development. Work with technology, product, and business partners to design and develop reusable templates, components, and services. Implement JavaScript frameworks, REST services, and integrations across the digital content management portfolio. Collaborate with enterprise architecture, QA, InfoSec, and other cross-functional teams to build scalable, secure, and reliable content products and solutions. Implement integrations across the digital content platform (e.g., AEM, Analytics, Target). Work collaboratively with QA and DevOps teams to adopt the CI/CD toolchain and test automation. Contribute to, and participate in, projects to ensure successful implementation of defined digital experience solutions.
BS/BA/MS in Computer Science, Information Systems, Life Sciences, or a related computational field
· 10+ years of BA experience with a technical mindset
· 5+ years of experience in a pharma manufacturing division (preferably at Merck)
· Experience working with data from manufacturing systems such as MES and Enterprise PI
· Hands-on experience authoring requirement specifications, quality assurance protocols, data mappings, validation documentation, Operational Qualification protocols, Performance Qualification protocols, SOPs, and work instructions
· Experienced professional working with client business teams to understand needs and present approaches, process flows, etc., and working with IT teams to communicate expectations, develop requirement specifications, etc.
· Excellent written and verbal communication skills
· Experience developing technology/capability-specific communication materials to build awareness among the user community, senior leadership, etc.
· Ability to plan for, coordinate, and conduct reviews of project deliverables for completeness, quality, and compliance with established project standards
Skills: Software Testing, ALM, Manual Testing, Automated Testing
· Candidate should be a BE/BTech/MCA graduate
· 8–12 years of working experience in manual testing
· Good knowledge of ALM, manual testing, and automated testing
· Hands-on experience analyzing and writing test cases/test scripts from requirement specifications
· Investigate and determine the cause of issues
· Handle UAT/SIT and execute test cases
· Excellent communication skills with business acumen required
· Working knowledge of testing tools preferred
· Pharma domain experience desired
5–7 years of experience with ETL or data integration
· Hands-on experience with data modeling/dimensional modeling
· Knowledge of the Kimball and Inmon data-warehouse methodologies (star-schema and snowflake-schema methodologies)
· Extensive experience with ETL tools
· Strong freehand SQL experience
· Strong Oracle ERP experience (HCM and Finance modules; sourcing data out of Oracle ERP into the data warehouse)
· Experience with Tableau administration
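The star-schema idea referenced above can be sketched in miniature: a fact table of keyed measures joined to small dimension tables, with analysis done by rolling measures up to dimension attributes. The HR-flavored tables and column names below are made up for illustration, not Oracle ERP's actual schema:

```python
# Minimal in-memory star schema: one fact table, two dimension tables.
dim_employee = {1: {"name": "Ana", "dept": "Finance"},
                2: {"name": "Raj", "dept": "HCM"}}
dim_date = {20240101: {"year": 2024, "quarter": "Q1"}}

fact_payroll = [  # each fact row: foreign keys into the dimensions + a measure
    {"emp_key": 1, "date_key": 20240101, "gross_pay": 5000.0},
    {"emp_key": 2, "date_key": 20240101, "gross_pay": 4200.0},
]

def pay_by_dept(facts, employees):
    """Roll the fact measure up to a dimension attribute (department)."""
    totals = {}
    for row in facts:
        dept = employees[row["emp_key"]]["dept"]
        totals[dept] = totals.get(dept, 0.0) + row["gross_pay"]
    return totals

print(pay_by_dept(fact_payroll, dim_employee))  # {'Finance': 5000.0, 'HCM': 4200.0}
```

In a snowflake schema, the `dept` attribute would itself be normalized into a separate department table keyed from `dim_employee`; the star form keeps it denormalized on the dimension for simpler joins.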
Experience managing DNS systems, specifically QIP, and network troubleshooting (90% of the job relates to troubleshooting)
· Experience manually changing DHCP
· Packet-capture information (NetScaler)
· Experience working with cloud technologies, specifically Azure
· Experience working with DDI
· Support, configure, upgrade, and maintain customer in-house servers, workstations, and network
· Administer third-party applications
· Create and deploy new workstations for employees
· Provide support to end users
· Install and configure network equipment to update or fix hardware or software issues internally
Bayer has too many data sources globally, with overlapping data and no effective mechanism to pull the data together for effective and timely business decisions. The goal of the project is to create common platforms that consolidate and deliver global data that Bayer's internal and external stakeholders can act on: combining data assets (people, hierarchies, transactions, accounting) from several sources and exposing them to consumers through APIs, Kafka topics, and an interactive data warehouse. The Scala developer will support the "Events 360" team and report to a lead engineer. The Scala developer will also support an API-based team that is working to provide endpoints to customers throughout the organization. This will be real-time company data that can identify a customer's purchase history, along with other backend metrics. This person must be able to jump into any Kafka issues, bugs, and enhancement development across roughly 50 applications and 250 endpoints. Bayer Corp. uses the most AWS space in the market, and the work environment is very innovative.
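The per-customer purchase-history rollup that such endpoints serve can be sketched in miniature. The event shape and field names below are assumptions for illustration, not Bayer's actual schema; in the real system these events would arrive on Kafka topics rather than from an in-memory list:

```python
from collections import defaultdict

# Assumed event shape; real events would be consumed from a Kafka topic.
events = [
    {"customer_id": "C1", "sku": "SEED-10", "qty": 2},
    {"customer_id": "C2", "sku": "SEED-10", "qty": 1},
    {"customer_id": "C1", "sku": "CROP-77", "qty": 5},
]

def purchase_history(stream):
    """Fold an event stream into per-customer purchase history."""
    history = defaultdict(list)
    for ev in stream:
        history[ev["customer_id"]].append((ev["sku"], ev["qty"]))
    return dict(history)

history = purchase_history(events)
print(history["C1"])  # [('SEED-10', 2), ('CROP-77', 5)]
```

An API endpoint would then serve `history[customer_id]` on request, with the fold kept up to date by the consumer as new events stream in.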
· Bachelor's in Computer Science or a related degree, or higher
· 5+ years of experience developing in .NET using C#
· Strong working knowledge of object-oriented concepts using C#
· Prior experience with automation tools like Selenium or Appium with C#/.NET
· Ability to work independently in a large team environment
- 8+ years of hands-on Java development experience; must be shown in the resume (candidates with less than 8 years will be disqualified)
- Experience working with Angular 2+ (the team is currently working with Angular 8)
- Strong background working in an AWS environment
- TFS or Jira experience
Nice-to-Have Qualifications:
- Excellent communication and soft skills; able to work in a team environment and bring ideas to the table (not just a heads-down role)
- Experience leading or mentoring junior members of the team
- EC2 and S3 experience
The Senior Engineering Manager role will oversee a team responsible for defining, building, and delivering technical solutions for the Hudson’s Bay Digital Technology team. They will maintain the long-term technical vision alongside the day-to-day execution of new features on multiple eCommerce sites. They will work closely with Product and Project Managers, Enterprise Architects, and Engineers to ensure a clear understanding of the solution and a smooth delivery. This position reports directly to the Senior Engineering Manager, Digital.
Equipment Order Re-Factoring: investigate, choose, and implement a process for equipment orders that better suits our current business requirements. The solution should provide more flexibility in our manufacturing process while accommodating our customers' changing demands. The current process was built at implementation, has changed little since, and uses templates and kitting within the templates. Problem: customers push out orders, so one customer may not be ready while another is; we want to move equipment over to the ready customer, but today that is not easy.
Work closely with the Development, Operations, and Release Management teams.
· Automate, implement, administer, and maintain continuous-development and continuous-deployment systems
· In-depth knowledge of build and deployment automation technologies
· An attitude and ability to take ownership and deliver a high-quality product, on time
· A strong bias for action and shipping product; process-oriented
· Experience with GitHub, Rundeck, Jenkins Pipelines, CloudBees, Redis, SiteMinder, Apache, Solr, WSO2, Splunk, Azure API Manager, Sauce Labs, Ansible, production monitoring/troubleshooting, and F5
· 2+ years with containerization technology, with the skills to configure and manage container platforms
· 2+ years of work experience with a strong knowledge of continuous integration and continuous deployment
· Experience implementing DevOps at multiple companies
· An appreciation and understanding of DevOps architecture, tools, and best practices
· A dedication to staying abreast of tools and technologies in the DevOps space
· Experience with continuous build automation, with the skills to configure and manage a Jenkins build server and set up and run build pipelines
Key Objectives:
· Be part of the initial project delivery team in order to architect the solution design and de-risk the SAP GTS project
· Analyze business requirements and work alongside SAP on system design
· Design and implement full SAP GTS solutions
· Identify integration issues and develop solutions to them
· Coach and develop skills within the project team
· Work effectively in mixed client and consultancy teams
Skills/Knowledge/Experience:
· Extensive knowledge of SAP GTS, specifically restricted-party screening, export management, and import management
· Multiple implementation experience with SAP GTS
· Extensive SAP implementation experience
· Strong client-facing consultancy experience
· Minimum of 4 project life-cycle implementations
· Language skills or project experience outside the US or UK – essential
· Experience with global multinational organizations and global rollouts – desirable
Has a good understanding of Avaya IP telephony and VoIP terminologies/technologies such as H.323, H.248, SIP, SBCs, and T1/E1 concepts.
· Knowledge of WAN, routing, switching, IPv4 and IPv6, and IP services
· Experience with multi-vendor OEMs for voice, video, and call center application integration
· Monitor and ensure availability of Contact Center and Enterprise Telephony applications; provide inputs for capacity management
· Good understanding of the ITIL process; perform Incident, Problem, and Change management
· Perform root cause analysis and carry out procedures to eliminate the cause
· Maintain and contribute to the troubleshooting knowledge base; perform trend analysis
· Add/remove, test, and deploy changes to features supported in the Contact Center and Telephony applications
· Coordinate with the OEM vendor to upgrade and patch the listed contact center applications
· Continuously review and update operational documentation for the Contact Center
· Perform proactive monitoring, upgrades, and patch management
· Assist the implementation team with new configurations and projects by providing inputs from a steady-state perspective
· Identify appropriate support tools and recommend them during implementation
· Identify components for proactive monitoring and preventive maintenance with periodic health checks, and document the process/steps
· 6+ years of experience in EPM applications and support
· Hyperion HFM/Planning/Essbase infrastructure experience: installation and configuration, and an understanding of its platform compatibility
· Hyperion infrastructure experience: installation and configuration on the Windows platform
· Experience in data migration, upgrades, and support
· Liaising with Oracle on product issues
· Excellent social and communication skills
· Experience in quality testing of artifacts and reports
Possess extensive analysis, design, and development experience in Hadoop and AWS Big Data platforms
· Able to critically inspect and analyze large, complex, multi-dimensional data sets in Big Data platforms
· Experience with Big Data technologies, distributed file systems, Hadoop, HDFS, Hive, and HBase
· Define and execute appropriate steps to validate various data feeds to and from the organization
· Collaborate with business partners to gain an in-depth understanding of data requirements and desired business outcomes
· Create scripts to extract, transfer, transform, load, and analyze data residing in Hadoop and RDBMSs, including Oracle and Teradata
· Design, implement, and load table structures in Hadoop and RDBMSs, including Oracle and Teradata, to facilitate detailed data analysis
· Participate in user acceptance testing in a fast-paced Agile development environment
· Troubleshoot data issues and work creatively and analytically to solve problems and design solutions
· Create documentation that clearly articulates designs, use cases, test results, and deliverables to varied audiences
· Create executive-level presentations and status reports
· Under general supervision, manage priorities for multiple projects simultaneously while meeting published deadlines
· Bachelor's or Master's degree in Computer Science, or equivalent work experience
· Highly proficient, with extensive experience working with relational databases, particularly Oracle and Teradata
· Excellent working knowledge of UNIX-based systems
· Excellent interpersonal, written, and verbal communication skills
· Very proficient in the use of Microsoft Office or G Suite productivity tools
· Experience designing solutions and implementing IT projects
· Exposure to DevOps, Agile methodology, and CI/CD methods and tools (e.g., JIRA, Jenkins) is a huge plus
· Prior work experience in a telecommunications environment is a huge plus
· Experience with Spark, Scala, R, and Python is a huge plus
· Experience with BI visualization tools such as Tableau and Qlik is a plus
· Background in financial reporting, financial planning, budgeting, and ERP (Enterprise Resource Planning) is a plus
· Exposure to advanced analytics tools and techniques (e.g., machine learning, predictive modeling) is a plus
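The extract-transform-validate-load scripting described in this posting can be sketched as a minimal loop over in-memory rows; the row shape and field names are assumptions for illustration, and a real job would read from Hive, Oracle, or Teradata rather than a Python list:

```python
source_rows = [  # extracted rows; a real feed would come from Hive or an RDBMS
    {"id": "1", "amount": "10.50"},
    {"id": "2", "amount": "bad"},   # malformed value, should fail validation
    {"id": "3", "amount": "7.25"},
]

def transform(row):
    """Cast string fields to typed values; raises ValueError on bad input."""
    return {"id": int(row["id"]), "amount": float(row["amount"])}

loaded, rejected = [], []
for row in source_rows:
    try:
        loaded.append(transform(row))  # would insert into the target table
    except ValueError:
        rejected.append(row)           # quarantine for data-quality review

print(len(loaded), len(rejected))  # 2 1
```

Separating the reject path from the load path is the core of the feed-validation step the posting describes: bad rows are counted and inspected rather than silently dropped or allowed to corrupt the target.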