Xandr Cares Jobs
Description
The Clearing House Engineering Team at Xandr is responsible for processing huge volumes of data through the platform and accounting for every penny attributed to our buyers, sellers, and third parties. In addition to owning the billing data pipeline, the team owns various custom data processing systems, APIs, and UIs that power client-facing financial interactions.

The Clearing House team is looking for a Software Engineer to help drive and scale the billing and financial clearing systems at Xandr. The role requires technical prowess in, or the ability to quickly learn, a multitude of technologies. We foster an environment of learning, so eagerness to learn and tackle new problems is integral to the role. The role requires working collaboratively within the team and across teams, as well as the ability to take ownership of products, projects, and technology.
About the job:
- Design and develop reliable, scalable, and testable big data jobs in an ecosystem centered around Hadoop (see the sketch after this list)
- Work capably as a full-stack engineer supporting a variety of application stacks across many mission-critical applications
- Participate in all stages of the SDLC, from design and development to deployment and support
- Work closely with product owners and stakeholders to ensure technical designs meet business needs, both present and future
- Work collaboratively with diverse teams around the company to build solutions that scale across competencies and technologies
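As a rough illustration of the first bullet above, here is a minimal sketch of a Hadoop MapReduce job in Java that sums spend per advertiser from tab-separated records. The record layout (advertiserId, spend in micros), class names, and job name are assumptions made for the example, not Xandr's actual billing schema or pipeline code.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

/**
 * Illustrative MapReduce job: sums spend per advertiser from tab-separated
 * records of the form "advertiserId<TAB>spendMicros". The field layout and
 * job name are hypothetical, used only to sketch the Hadoop programming model.
 */
public class SpendAggregationJob {

    public static class SpendMapper
            extends Mapper<LongWritable, Text, Text, LongWritable> {
        private final Text advertiser = new Text();
        private final LongWritable spend = new LongWritable();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split("\t");
            if (fields.length < 2) {
                return; // skip malformed records
            }
            advertiser.set(fields[0]);
            spend.set(Long.parseLong(fields[1]));
            context.write(advertiser, spend);
        }
    }

    public static class SpendReducer
            extends Reducer<Text, LongWritable, Text, LongWritable> {
        @Override
        protected void reduce(Text key, Iterable<LongWritable> values, Context context)
                throws IOException, InterruptedException {
            long total = 0;
            for (LongWritable v : values) {
                total += v.get();
            }
            context.write(key, new LongWritable(total));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "spend-aggregation");
        job.setJarByClass(SpendAggregationJob.class);
        job.setMapperClass(SpendMapper.class);
        job.setCombinerClass(SpendReducer.class); // same types in and out, so it doubles as a combiner
        job.setReducerClass(SpendReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(LongWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

A job like this would typically be packaged as a jar and submitted with `hadoop jar`, with the input and output HDFS paths passed as arguments.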
Qualifications
- BA/BS or MS degree in Computer Science or a related field
- 3+ years of experience in software engineering
- Proficient in Java and at least one more language, such as Python, C/C++, JavaScript, or Scala
- Knowledgeable about RESTful APIs and at least one API framework (some examples we currently use: Dropwizard, Play Framework, Flask, Node Hapi; experience with any framework is a plus); see the sketch after this list
- Comfortable working with databases from a design and data retrieval standpoint, especially SQL databases such as MySQL or PostgreSQL
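As a sketch of the RESTful API bullet above, the example below wires a single JAX-RS resource into a Dropwizard application, assuming Dropwizard 2.x package names (javax.ws.rs) and Java 9+. The service name, endpoint path, and response fields are hypothetical, invented only to illustrate the framework.

```java
import io.dropwizard.Application;
import io.dropwizard.Configuration;
import io.dropwizard.setup.Environment;

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import java.util.Map;

/**
 * Minimal Dropwizard service exposing a single read-only endpoint.
 * The resource path and response shape are illustrative assumptions,
 * not an actual Xandr API.
 */
public class InvoiceServiceApplication extends Application<Configuration> {

    /** JAX-RS resource: GET /invoices/{id} returns a small JSON document. */
    @Path("/invoices")
    @Produces(MediaType.APPLICATION_JSON)
    public static class InvoiceResource {
        @GET
        @Path("/{id}")
        public Map<String, Object> getInvoice(@PathParam("id") long id) {
            // In a real service this would come from a database or a downstream system.
            return Map.of("invoiceId", id, "status", "FINALIZED", "totalMicros", 1_250_000L);
        }
    }

    @Override
    public void run(Configuration configuration, Environment environment) {
        environment.jersey().register(new InvoiceResource());
    }

    public static void main(String[] args) throws Exception {
        // Typical invocation: java -jar service.jar server config.yml
        new InvoiceServiceApplication().run(args);
    }
}
```

Running it with the `server config.yml` arguments starts an embedded Jetty server; Dropwizard's Environment is also where health checks and metrics get registered.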
Nice-to-have skills:
- Experience with big data processing technologies such as Hadoop or Spark
- Experience working on large-scale real-time systems in C/C++
- Comfortable working with Unix environments and shell/bash scripting
- Experience with Docker/Kubernetes-based application build and deployment environments
- Experience with data streaming technologies such as Kafka or the Lambda Architecture (see the sketch after this list)
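For the streaming bullet, the sketch below uses the standard Kafka Java client to publish a single billing event. The broker address, topic name, key, and JSON payload are placeholders for the example, not Xandr's actual streaming configuration.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

/**
 * Minimal Kafka producer that publishes one billing-event message.
 * Broker address, topic, and payload are illustrative placeholders.
 */
public class BillingEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.ACKS_CONFIG, "all"); // wait for full replication before acknowledging

        try (Producer<String, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<String, String> record = new ProducerRecord<>(
                    "billing-events",                                   // topic (hypothetical)
                    "advertiser-42",                                    // key: groups events per advertiser
                    "{\"advertiserId\":42,\"spendMicros\":1250000}");   // value: JSON payload
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace();
                } else {
                    System.out.printf("Wrote to %s partition %d offset %d%n",
                            metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
            producer.flush();
        }
    }
}
```

Keying records by advertiser keeps all events for one advertiser in the same partition, which preserves per-advertiser ordering on the consumer side.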
More about you:
- You are passionate about a culture of learning and teaching. You love challenging yourself to constantly improve, and sharing your knowledge to empower others
- You like to take risks when looking for novel solutions to complex problems. If faced with roadblocks, you continue to reach higher to make greatness happen
- You care about solving big, systemic problems. You look beyond the surface to understand root causes so that you can build long-term solutions for the whole ecosystem
- You believe in not only serving customers, but also empowering them by providing knowledge and tools
Big Data Intern
Pros
Good experience overall. Colleagues were very helpful. The atmosphere was relaxed and there was no rush to complete the project. I was given ample time to understand the project and contribute.
Cons
People are laid back and don't take the initiative to build something new or optimize existing systems. It's not primarily an engineering company, so if you know how to talk, you will go further in your career.