Job Description:
Over the past 25 years, With Intelligence has evolved from a traditional financial publisher into a dynamic, product-led fintech company. Our mission is to empower investors and managers worldwide by connecting them to the people and data they need to raise and allocate assets efficiently.
We have recently secured a new round of funding from a prominent technology investor. This investment will drive our plan to elevate our product into a pioneering, market-leading platform.
We are rapidly expanding our focus on data, both internally and within our products and services, and we are now looking for more help in this area to keep up with the growing demands of a dynamic, data-driven organisation. We need someone who wants to get stuck in and get things done! The role requires knowledge and understanding of some data and technical concepts and tools, but we do not expect candidates to have extensive professional experience in all of these areas. This role will give you hands-on experience in multiple areas, making it a great opportunity to learn very quickly.
Responsibilities:
Develop, maintain, and optimise ETL processes for data extraction, transformation, and loading
Create and manage data models and data warehousing solutions
Write and maintain structured queries, as well as content-scraping projects
Utilise programming languages such as Python and SQL for data-processing tasks
Integrate apps and services, connecting to internal and third-party APIs
Collaborate with cross-functional teams to ensure seamless integration of data processes
Optimise data pipelines for performance and efficiency
Work closely with data scientists and analysts to support their data needs
Build pipelines to transform raw, unstructured data into useful information by leveraging AI and LLMs
Requirements:
Proven experience in data engineering and proficiency in designing and implementing scalable data architectures
Strong experience with ETL processes, data modelling, and data warehousing (we use Airflow, dbt, and Redshift)
Expertise in database technologies, both relational (SQL) and NoSQL
Knowledge of cloud platforms (AWS)
Solid understanding of data security measures and compliance standards
Excellent Python experience
Collaborative skills to work closely with data scientists and analysts
Ability to optimise data pipelines for performance and efficiency
Ability to build, test, and maintain tasks and projects
Experience with version control systems such as Git
It would be nice if you had:
Experience with Airflow and/or dbt
Experience working in an Agile environment using Scrum/Kanban
Hands-on experience working within a DevOps environment
Benefits:
Annual Leave
22 days per calendar year (and all local bank holidays)
An additional day off for your birthday
Wellness and Care Leave
Up to 5 days for self-care or wellness
Volunteer Day
1 day off to support a charity of your choice
Share With Scheme
Eligible employees receive a share in a qualifying event
Therapy Sessions
50% contribution towards therapy sessions, up to BGN 90 per session
Lunch & Learns
Events held throughout the year with educational or informative topics
Charity Matching Days
Company matches charity sponsorships up to BGN 3,500
Hybrid Working
3 days in the office, 2 days working from home
Multisport
50% contribution from the employer, 50% from the employee
State Benefits
Pension & health provision
Family-friendly policy
Office Facilities
Contemporary office space
Free onsite gym
Address: 51, "Cherni vrah" Blvd, 1407 Sofia, WorkBetter Coworking space
Employee Assistance Programme (EAP)
24-hour confidential health assistance via TelusHealth, including:
Counselling support
Financial wellbeing
Bereavement support
Legal information
Medical information
Refer a Friend
BGN 1,200 reward for successful referral
Hardship Fund
Financial assistance repayable with low interest over 3, 6, or 12 months
Learn With Us
Access to a learning platform with over 80,000 free courses