Overview: Kreative Technologies is in search of a motivated Senior Software Engineer with leadership qualities to help support Military Health Applications for the Department of Defense (DoD). The candidate must possess working knowledge of a variety of Information Technology (IT) infrastructures and enterprise system technologies supporting a virtual operating environment. The software engineer will be expected to develop in a fast-paced environment with leading technologies, including the HL7 framework, OpenLink, SOAP, JSON, Java, XCA, C#, .NET, T-SQL, MS SQL Server, Visual Studio Professional, and Git. Experience: 5 years of experience developing interfaces; 5 years of experience with ETL; familiarity with DoD Information Security Policies, Information Assurance practices, and Assessment & Authorization (A&A) requirements; familiarity with network architectures, including intercommunications between servers/systems required by applications.
As a member of the M&A team, this Program Manager will be responsible for driving, and collaborating with cross-functional teams on, the successful execution and completion of migrating an organization from Salesforce onto Adobe’s platform on Microsoft Dynamics and SAP ECC (for Financials).
Job Description: 3+ years of experience with wireless network core technology and components. Advanced knowledge of reporting tools like Tableau, Splunk, ELK, Grafana (must have advanced skills in at least one of these tools). Expertise in the Linux command line (must have). Familiarity with Wi-Fi standards (802.11) and features. Coding skills: Python, intermediate (must have); SQL, intermediate/advanced (must have). DBA fundamentals. Talend programming application experience (must have). Networking fundamentals, L2 & IP networking. DOCSIS plant and networking basics.
WebLogic Middleware Management; Python & Oracle Databases; DevOps Practices and Tools
Position Summary: This role is with the Enterprise Integration Services (EIS) division of H&R Block (Alanna's team, Client Data Services, falls within this business unit), which is the primary integration partner for WorkCenter (the application that H&R Block tax pros use in retail locations). Data from WorkCenter is transmitted to a back-end service layer; these services include User Authentication. This is new development and enhancements. Skills needed: 7+ years in Java development (1.8), Spring, Spring MVC, Agile. REST services and/or microservices experience required. She would like to see experience in newer tech and tools like microservices, ELK, AWS, etc., but those are nice-to-haves. Java Software Engineer; Java 8; any ELK stack experience would be ideal. Contract through the end of this year.
• .NET Development: 5+ years of experience with .NET version 4.3. • Production Support: Experience monitoring platforms and providing production support. • Monitoring Tools: Familiarity with New Relic or similar tools. • Networking: Solid understanding of networking principles. • CI/CD Pipelines: Experience working with CI/CD pipelines. • Cloud Computing: Preferably with Google Cloud Platform (GCP). • Source Control: Proficient in using Git for source control. • Web Services: Experience with web services and understanding of internet traffic flow. • Frontend Development: Basic experience with Angular or React.
Description — Job Title: Senior DevOps Engineer, Partner Web Services (PWS). Autodesk is seeking a motivated and experienced DevOps engineer to join our Partner Web Services (PWS) team. You'll be responsible for building processes and automation that increase our ability to deliver applications and services at high velocity.
Responsibilities: Work in an agile environment, collaborating with the team to automate engineering processes in quick iterations. Apply continuous attention to technical excellence and good design principles, resulting in scalable, reliable, performant, and maintainable infrastructure. Work closely with product owners, engineers, and test engineers to understand requirements and define best practices and standards around DevOps and service resilience. Build and maintain monitoring, resilience, testing, and rollback tools to maintain service health and uptime. Install, secure, monitor, manage, and maintain critical platform infrastructure in a cloud environment. Track incidents and participate in active on-call activities to help resolve service issues in a timely manner.
Basic Qualifications: Bachelor's degree or higher in Computer Science, Engineering, or a related field. 7+ years of strong hands-on experience in DevOps build and/or environment automation. Hands-on experience with AWS (CloudFormation, EC2, Lambda, S3, RDS, VPC, Route 53, CloudWatch, CloudTrail, IAM, etc.). Hands-on experience building CI/CD pipelines using Git, Jenkins, Docker, Ansible, Chef/Puppet, etc. Hands-on experience with deployment techniques such as Blue/Green deployments for zero downtime. Proficiency with at least one of these automation scripting languages: Java, Python, Ruby, JavaScript, Unix shell, Groovy. Strong experience building monitors and alerts in a highly clustered cloud computing environment. Deep understanding of security best practices and standards around cloud computing and access management. Solid understanding of fundamental technologies like DNS, load balancing, TCP/IP, SSL, NAT, DHCP, etc. Strong verbal and written communication skills. Strong analytical skills with excellent problem-solving abilities. Must be extremely detail-oriented with respect to documentation and communication.
Preferred Qualifications: Familiarity with Agile/Scrum, continuous integration/delivery, and modern development practices. Experience working with distributed systems that use technologies like Ruby, Apigee, Elasticsearch, Node.js, PostgreSQL, ZooKeeper, etc. Experience with secrets management tools like Vault and AWS Parameter Store. Experience integrating with application performance management systems like Dynatrace, New Relic, AppDynamics, etc. Experience working with logging and monitoring platforms like Splunk, the ELK stack, etc. Experience with Amazon Web Services; public cloud providers like Heroku are nice to have. Disaster recovery planning and experience with high-availability systems in a cloud environment.
Key Responsibilities
Job Description — Must Have (Required Skillset): Proficient in Postgres and at least one NoSQL database (e.g., Couchbase, Cassandra, MongoDB). Process automation experience (Ansible). Nice to Have: Proficiency in a scripting language for automation (e.g., Perl, Python, Shell); strong analytical and troubleshooting skills; a true passion for your chosen field; MySQL and additional NoSQL experience (Couchbase, Cassandra, MongoDB); experience in a virtual environment, including but not limited to VMware, OpenStack, or Amazon. Experience: 4–6 years.
The Senior Data Analyst position will be responsible for the management, design, testing, and implementation of Data Warehouse projects for Policy, Claims, Quotes, Billing, Agency, ancillary components, and application data. Primary duties will include requirements review, estimating, analysis, data modeling, development of technical specification documents, development team support and guidance, significant quality assurance, implementation tracking, downstream application support, and monitoring of data in production databases. This role also requires the ability to internally manage the project from Analysis through Post-Implementation phases. One key near-term initiative for this role is participation in migrating and modernizing the current Data Warehouse, moving it to AWS within Snowflake/Redshift, so knowledge and experience in this area is desired. This role will report to the Data Warehouse Manager, but on some projects may report directly to the Director.
Serve as the team’s technical subject-matter expert and a resource for technical escalation. Work with business counterparts to understand the business requirements and provide a robust, higher-level technical solution. Manage and lead a team of Data Engineers in delivering data-warehousing solutions. Experience leading discovery sessions. Experience with integration technologies (e.g., DataStage, Matillion). Experience building reporting solutions (e.g., ThoughtSpot, Cognos, Power BI, Tableau). Experience with iterative, agile, and/or scrum development. Identify potential issues and implement solutions proactively. Problem-solving/analytical thinking. Experience working with teams across different time zones; good teamwork and collaboration. Engage with senior managers regularly, reporting on project status, activities, and achievements.
Should be strong in Tableau. The resource should be able to visualize and explain the data models, and should be able to compare and validate the differences. Should be strong in Excel. Should be strong in SQL.
4+ years of enterprise Big Data platform administration experience using Hadoop (Cloudera or equivalent). Enterprise big data security and management operations using tools/services such as Sentry and Kerberos. Exposure to the Hadoop ecosystem (Cloudera distribution), including components such as HDFS, Spark, Sqoop, Oozie, Flume, Hive, Impala, MapReduce, Sentry, and Navigator. Experience configuring third-party systems such as Informatica Big Data Edition / Arcadia / Trifacta / Attunity preferred. Experience working in Unix/Linux environments as well as Windows environments. Experience with Java, Scala, or Python, in addition to creating shell scripts for automating tasks/housekeeping.
Provide functional and technical support for the OBIEE reporting tool, covering enhancements, reimplementation, and break-fixes. Support leadership dashboards and functional excellence dashboards in OBIEE, and support Talent Management dashboards in HCM Cloud. Provide technical support for the Oracle OACS application and Autonomous Data Warehouse Cloud Service (ADWC). Validate technical deliveries from the IT team and perform quality checks before user testing. Assist the HRIS team in requirements gathering for OBIEE and OACS reports and document requirements using Cummins templates. Act as a liaison between the HRIS and HRIT teams for functional and technical discussions. Assist users with testing and documentation. Create test cases, test scenarios, and test plans for internal and user testing. Create functional and technical documentation, training documentation, job aids, etc. Produce data for audits using SQL and complete OneSource audits requested by IT Compliance on a quarterly basis.
Responsibilities: Expertise in Elastic Cloud Enterprise. Work with enterprise architects to advise on best practices for Kafka and Elasticsearch, both on-prem and in the cloud. Design and implement DR solutions for Confluent and Apache Kafka and Elasticsearch. Modernize ETL pipelines through the use of Kafka and Elastic. Responsible for Kafka tuning, capacity planning, disaster recovery, replication, and troubleshooting. Implement Kafka security, limiting bandwidth usage, enforcing client quotas, and handling backup and restoration. In-depth understanding of the internals of Kafka cluster management, ZooKeeper, partitioning, schema registry, topic replication, and mirroring. Architect, configure, deploy, and maintain Elasticsearch clusters. Configure Logstash and Beats to collect the data necessary to meet client requirements. Configure X-Pack plugins, including Security, Watcher, Machine Learning, Monitoring, Graph, and Reporting. Design, implement, and configure Kibana visualizations and dashboards. Maintain Kafka connectors to move data between systems. Experience training, mentoring, and leading an emerging team is beneficial. Knowledge of CloudFormation/Terraform scripts. Work closely with Big Data and AWS cloud technology groups. Experience with open-source Kafka distributions as well as enterprise Kafka products preferred. Familiarity with both cloud-native Kafka (on AWS) and on-premises architectures. Manage Kafka and Elasticsearch clusters and create tools to automate and improve the clusters' reliability and performance.
Must-haves: 5+ years of experience with C#; experience with a SQL Server back end; experience with web-based applications; cloud experience (Azure preferred, AWS OK); Angular experience. Pluses: microservices experience; Bachelor's degree. Day-to-Day: TTofT in Houston, TX is seeking 4 full-stack developers for a client in the Galleria area. These developers will be joining a team of 3 full-time developers and replacing a current offshore team. The team of developers will be doing mostly new development for 3 main portals and supporting those portals through all phases of the development cycle. The team will attend daily scrum meetings and work in 2-week sprints on all reports to make sure that all deadlines are met. The user interface is written in Angular, and this team is responsible for developing a user-friendly portal written in C# and supporting the portal until the go-live date of 11/15/19.
AEM Back-End Developer — Job Description: Software developer with at least 5 years of back-end AEM experience in an enterprise environment. Experience should include creating custom components to display content on the front end, providing redirects when needed, implementing search, applying content filters on a page via tags, and persisting those filters across pages.
4+ years' experience in marketing, digital, or sales operations. Minimum of 3 years' experience implementing automated email journeys in Marketo. Proficient with the Marketo data model and standard Salesforce integration. Prior experience with Sales Insights, Bizible, Event Management, Digital Advertising, and Munchkin. Deep understanding of CSS and mobile-first HTML email development, including Velocity Template Language (VTL) and JavaScript. Familiarity with email design and deliverability best practices. Experience building and testing user journeys and creating in-depth test plans and strategies. End-to-end development experience with emails and forms/landing pages, from creation to deployment and reporting. Proven expertise in devising innovative solutions to address business marketing requirements using Marketo. Proactive self-starter with drive to build a first-class consumer experience. Ability to train the digital marketing team and IT team on Marketo and Sales Insights. Excellent verbal and written communication skills. Work directly with the vendor, client development team, and business. Agile/Scrum experience is preferable.