Facebook's mission is to give people the power to build community and bring the world closer together. Through our family of apps and services, we're building a different kind of company that connects billions of people around the world and gives them ways to share what matters most to them. Whether we're creating new products or helping a small business expand its reach, people at Facebook are builders at heart. Our global teams are constantly iterating, solving problems, and working together to empower people everywhere to build community and connect in meaningful ways. Together, we can help people build stronger communities, and we're just getting started.
At Facebook, we work with data each and every day. Would you like to build some of the tools that are critical to moving and transforming this data into valuable, insightful information? If so, this is the right job for you.
Our Enterprise Data Warehouse team works closely with all aspects of data, both internal and external. We are looking for a Data Engineer with the software engineering chops not only to build data pipelines that move data across systems efficiently and reliably, but also to build the next generation of data tools that let us take full advantage of that data. In this role, your work will broadly influence the company's data consumers and analysts, and you will get the opportunity to work on focused, large-scale objectives in a company that has some of the most challenging problems to tackle.
This is a full-time position based in our office in Singapore.
Build data expertise and own data quality for the awesome pipelines you build
Architect, build and launch new data models that provide intuitive analytics to your customers
Design, build and launch extremely efficient and reliable data pipelines to move data (in both large and small volumes) into our ridiculously large Data Warehouse
Design and develop new systems and tools to enable folks to consume and understand data faster
Use your expert coding skills across a number of languages, including Python, Java and PHP
Develop applications within the LAMP stack environment
Work across multiple teams in high visibility roles and own the solution end-to-end
2+ years of Java and/or Python development experience is necessary
2+ years of SQL experience (Oracle, Vertica, Hive, etc.) is required
2+ years of LAMP stack development experience is necessary
1+ years of experience with dimensional data modeling & schema design in Data Warehouses
Experience working with either a MapReduce or an MPP system at any scale
Ability to write reusable code components
Communication skills, including the ability to identify and communicate data-driven insights
BS or MS degree in Computer Science or a related technical field