Freelance SQL jobs for professional developers! You do not need to waste precious time searching for work: see salaries, apply easily, and get hired. Landing high-paying freelance SQL jobs can be difficult, but it does not have to be. Discover your next career move with Vollna! Freelance SQL jobs online are a great option for working remotely from home. Create a free account with us now!
Vector Database Query Execution Program Development
Client Rank: Medium ($100 total spent, 1 hire, USA)
Job Description:
I am looking for an experienced Java developer with database systems expertise to complete Task 8 of a university project related to a Vector Database Management System (DBMS). Tasks 1-7 have already been completed, and all necessary files will be provided. Your responsibility is only to implement Task 8: Query Specification and Execution.

Project Details: The project involves working with MiniBase, a simplified relational DBMS. The goal is to extend it to support efficient nearest-neighbor and range queries on a 100-dimensional vector dataset.

Task 8 Overview – Query Specification and Execution: Implement a command-line program, query, that executes queries on the database. The program should support:
- Range Queries: retrieve all vectors within a given distance from a target vector.
- Top-K Nearest Neighbor Queries: retrieve the K closest vectors to a target.
The query input comes from an ASCII file (QSNAME) specifying the query type, target vector, and parameters. The program should be able to:
- Execute queries with or without an LSH-forest index.
- Use a specified number of buffer pages (NUMBUF) to optimize query execution.
- Report the number of disk pages read and written during query execution.

Required Skills:
- Strong Java programming skills
- Experience with database management systems (DBMS)
- Knowledge of query processing and indexing (LSH-forest preferred)
- Familiarity with MiniBase or similar database frameworks
- Ability to work with disk-based query execution

Deliverables:
- Java source code for the query program with proper documentation.
- Executable program that processes the provided test queries.
- Performance statistics, including the number of disk pages accessed.
- Clear instructions on how to compile and run the program.
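For orientation, a minimal Python sketch of the two query types described above (brute-force, in-memory, Euclidean distance). The actual deliverable is a Java program against MiniBase with an optional LSH-forest index, so treat this only as a reference for the expected semantics; array sizes and names are illustrative.

```python
import numpy as np

def range_query(data: np.ndarray, target: np.ndarray, radius: float) -> np.ndarray:
    """Return indices of all vectors within `radius` of `target` (Euclidean)."""
    dists = np.linalg.norm(data - target, axis=1)
    return np.where(dists <= radius)[0]

def top_k_query(data: np.ndarray, target: np.ndarray, k: int) -> np.ndarray:
    """Return indices of the k vectors closest to `target`."""
    dists = np.linalg.norm(data - target, axis=1)
    return np.argsort(dists)[:k]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    vectors = rng.integers(0, 10000, size=(1000, 100)).astype(float)  # 100-dimensional dataset
    target = vectors[0]
    print(range_query(vectors, target, radius=500.0))
    print(top_k_query(vectors, target, k=5))
```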
Skills: MySQL, SQL, Microsoft SQL Server, Database Design, Python
Fixed budget: 100 USD
Posted 12 hours ago
Project report
Client Rank: Risky
I am looking for someone who can analyse my data in Excel using a chi-square test and generate a project report.
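As an illustration of the requested analysis, a chi-square test of independence can be run in a few lines (shown here with Python and scipy rather than Excel; the contingency-table values are placeholders, not the client's data):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 contingency table of observed counts (e.g., group vs. outcome).
observed = np.array([[30, 10],
                     [20, 40]])

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2={chi2:.3f}, p={p_value:.4f}, dof={dof}")
print("expected counts:\n", expected)
```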
Skills: Tableau, Python, Microsoft Excel, Statistics, Data Analysis, Data Visualization, SQL, Data Science, Machine Learning, Data Scraping, Data Mining, Data Modeling, Data Analytics & Visualization Software
Budget: not specified
Posted 12 hours ago
Freelance Data Analyst Contractor
Client Rank: Medium ($698 total spent, 1 hire, 1 job posted, 100% hire rate, open job)
Featured
Freelance Data Analyst Contractor
US-Based Retail Clothing Business. We are a US-based retail clothing business seeking a talented and experienced Freelance Data Analyst Contractor for a 6-month engagement.

Job Responsibilities:
- Build analytical reports and data infrastructure across various business domains, including purchasing, customer order management, accounting, pricing, product catalog, helpdesk management, and more.
- Collaborate with different business functions to understand their data needs and provide insights.
- Develop data dashboards and visualizations to support business decision-making.
- Ensure data integrity and accuracy through rigorous validation and testing.

Qualifications:
- Strong technical skills in computer software for data analysis (e.g., SQL, Excel, Tableau, Power BI).
- Basic to medium level coding ability (e.g., Python, R).
- Proven experience in data analysis and report building.
- Excellent problem-solving and analytical skills.
- Strong communication skills to present findings clearly and concisely.

Contract Details:
- Duration: Approximately 6 months
- Position: Freelance contractor
- Location: Remote

If you are a data-savvy professional with a knack for uncovering insights and driving business improvements, we would love to hear from you!
Skills: Python, SQL, Operations Analytics, Dashboard, Analytical Presentation, Data Analytics, Microsoft Power BI Development, Excel Macros, Data Visualization
Budget: not specified
Posted 10 hours ago
Integration Specialist Needed for Fingertech Attendance Data with Business Central
Client Rank: Good ($4,492 total spent, 1 hire)
**Job Description**
We are looking for an experienced Integration Specialist to connect FingerTech attendance data with Business Central and implement WorkForce Tracker: Employee Attendance & Incentives Management.

Key Responsibilities:
- Integrate FingerTech attendance data with Business Central
- Calculate employee overtime
- Calculate the Driver Trip Bonus
- Calculate the Batcher & Loader Incentive
- Track employee multitasking incentives
- Implement an approval workflow for supervisors to review and approve the above information
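To make one of the listed calculations concrete, here is a hedged sketch of a daily-overtime computation from raw attendance punches. The real logic would live in the Business Central integration; the 8-hour threshold and record layout are assumptions.

```python
from datetime import datetime

STANDARD_HOURS = 8.0  # assumed daily threshold; confirm against company policy

# Hypothetical punch records exported from the FingerTech device.
punches = [
    {"employee": "E001", "in": "2024-03-01 08:00", "out": "2024-03-01 18:30"},
    {"employee": "E002", "in": "2024-03-01 09:00", "out": "2024-03-01 17:00"},
]

def overtime_hours(record: dict) -> float:
    """Hours worked beyond the assumed standard day, never negative."""
    fmt = "%Y-%m-%d %H:%M"
    worked = datetime.strptime(record["out"], fmt) - datetime.strptime(record["in"], fmt)
    return max(worked.total_seconds() / 3600 - STANDARD_HOURS, 0.0)

for rec in punches:
    print(rec["employee"], round(overtime_hours(rec), 2), "overtime hours")
```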
Skills: Data Entry, Microsoft Excel, Web Development, SQL, WordPress
Fixed budget: 1,000 USD
Posted 10 hours ago
Task # 3
Client Rank: Risky ($1,360 total spent, 1 hire, 1 job posted, 100% hire rate, open job, 0.00 of 1 review, USA)
Task # 3 from earlier project.
Note: Subham, can you solve Task #3? I still want to work with your team, as I did with Nirav and the team. If so, can you tell me the cost?
Skills: Microsoft Power BI, Tableau, Snowflake, GraphQL, SQL, API, PostgreSQL, Microsoft SQL Server, Apache Hadoop, Microsoft Power Automate Administration, Facebook Ads Manager, Adobe Photoshop, Google Merchant Center
Budget: not specified
Posted 9 hours ago
Web Scraping Expert Needed for E-commerce Data Extraction
Client Rank: Risky
I am seeking a skilled web scraping professional to extract data from a Vietnamese e-commerce website. The ideal candidate should have experience with web scraping tools and techniques, as well as a solid understanding of data extraction from dynamic web pages. Your work will include gathering product information, prices, and other relevant details to help us analyze the e-commerce market. If you have a proven track record in web scraping and can deliver clean, structured data, please apply!
Skills: SQL, Data Scraping, Data Entry, Data Mining, Scrapy, Python, Selenium, Selenium WebDriver
Budget: not specified
Posted 9 hours ago
Deleting Google search results
Client Rank: Risky
Hello!
I'd like to personally invite you to apply to my job. Please review the job post and apply if you're available. https://en.autoconsultant.com.ua/automobile/audi/a4/e47fb9d5e1af365-2024-audi-a4-waueaaf41rn010519/ I need to delete all the Google search results for this auction car that appear when you search the car's VIN number.
Skills: Information Security, SOC 2 Report, ISO 27001, Governance, Risk Management & Compliance, Compliance, Information Security Audit, Data Privacy, NIST SP 800-53, GDPR, HIPAA, NIST Cybersecurity Framework, Security Framework, Microsoft SQL Server, Risk Management, CMMC
Budget: not specified
Posted 8 hours ago
Migration of Stackblitz and Supabase to local windows machine, and local db in Postgres
Client Rank: Excellent ($2,106 total spent, 28 hires, 17 jobs posted, 100% hire rate, open job, 4.99 of 22 reviews)
I have a working five-page app built using StackBlitz and Supabase, but I need to migrate the database to my local PostgreSQL instance on a Windows machine. Along with the migration, I need modifications to the code to ensure the app runs smoothly locally.
Scope of Work:

Database Migration:
- Move the existing Supabase database to a local PostgreSQL instance.
- Ensure the database is properly structured and importable without errors.

Code Adjustments & Cleanup:
- Modify the app code to connect to the local database instead of Supabase.
- Ensure user authentication and data processing work locally.
- Clean and organize the code for better maintainability.

Installation & Setup Guide:
- Provide a detailed step-by-step guide to set up and run the app locally.
- Include instructions for database import, environment setup, and execution.

Milestones & Budget ($200 Total):
- Milestone 1 ($30): Proof of database import with all existing data working in local PostgreSQL.
- Milestone 2 ($40): Login screen and user management functionality working locally.
- Milestone 3 ($50): Data entry and calculations working properly.
- Milestone 4 ($80): Full installation and testing guide, ensuring everything runs smoothly.

Requirements:
- Experience with Supabase, PostgreSQL, JavaScript/TypeScript frameworks.
- Ability to migrate and troubleshoot Supabase to PostgreSQL.
- Strong documentation skills for a clear installation guide.

Timeline: 4 days

If you have experience with this kind of migration, please apply with examples of similar work. Looking forward to working with you!
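Since Supabase exposes a standard Postgres connection string, one common migration path is pg_dump/pg_restore; a hedged sketch driven from Python is below. The connection strings are placeholders, and the app-code changes and authentication work described above are out of scope for this snippet.

```python
import subprocess

# Placeholder connection strings; Supabase shows the source URI in its dashboard.
SUPABASE_URI = "postgresql://postgres:<password>@db.<project>.supabase.co:5432/postgres"
LOCAL_URI = "postgresql://postgres:postgres@localhost:5432/myapp"

# Dump the remote schema and data in custom format.
subprocess.run(["pg_dump", "--no-owner", "--format=custom",
                "--file=supabase.dump", SUPABASE_URI], check=True)

# Restore into the local instance (the local database must already exist).
subprocess.run(["pg_restore", "--no-owner", "--clean", "--if-exists",
                "--dbname=" + LOCAL_URI, "supabase.dump"], check=True)
```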
Skills: React, SQL, Supabase, PostgreSQL
Fixed budget: 200 USD
Posted 7 hours ago
Python Developer | ChatGPT Chatbot (FAISS + SQL + AWS EC2)
Client Rank: Excellent ($5,159 total spent, 22 hires, 15 jobs posted, 100% hire rate, open job, 5.00 of 14 reviews)
We have a ChatGPT-powered chatbot API in Python that provides custom responses based on:
✅ FAQ stored in FAISS (Vector Database)
✅ Additional data stored in a remote SQL database

Scope of Work:
1. Update FAQ & SQL Data – Modify and optimize stored data.
2. Improve Chatbot Performance – Validate response accuracy.
3. Deploy to AWS EC2 – Set up and publish the project for live use.

Who We Need:
• Experience in Python, FAISS (Vector Database), and SQL.
• Familiarity with AWS EC2 deployment & best practices.
• Prior experience with LLM/ChatGPT API-based chatbots is preferred.

Looking for a skilled and efficient developer to complete this quickly. Apply with relevant experience!
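For reference, a minimal sketch of the FAISS side of such a chatbot: index the FAQ embeddings and retrieve the nearest entries for an incoming question. The embedding dimension and data here are placeholders; the actual project already has its own index and SQL layer.

```python
import numpy as np
import faiss

dim = 384  # assumed embedding dimension
faq_embeddings = np.random.rand(100, dim).astype("float32")  # stand-in for real FAQ embeddings

index = faiss.IndexFlatL2(dim)   # exact L2 search over the FAQ vectors
index.add(faq_embeddings)

query = np.random.rand(1, dim).astype("float32")  # embedding of the user's question
distances, ids = index.search(query, 3)           # top-3 closest FAQ entries
print(ids[0], distances[0])
```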
Skills: ChatGPT, Python, MySQL, Vector Database
Fixed budget: 200 USD
Posted 7 hours ago
Xamarin App Optimization Expert Needed
Client Rank: Good ($1,836 total spent, 2 hires, 1 job posted, 100% hire rate, open job)
We are seeking an experienced developer to optimize our fully functioning application built with Xamarin, Node.js, and a SQL database. The goal is to enhance performance, streamline processes, and improve user experience. The ideal candidate should have a strong background in mobile app development, specifically with Xamarin, and be proficient in backend optimization with Node.js and SQL. If you have a proven track record of optimizing applications, we would love to hear from you.
Skills: Android, iOS, Smartphone, Xamarin, Mobile App Development
Hourly rate: 15 - 30 USD
Posted 7 hours ago
SQL Server Analysis Services
Client Rank: Excellent ($999 total spent, 44 hires, 35 jobs posted, 100% hire rate, open job, 5.00 of 32 reviews)
I am looking for an experienced Microsoft BI professional to help me complete three tasks. Need someone who is good at SSMS and Visual Studio.

Part 1: Define KPIs in SSAS Multidimensional
- Use the existing multidimensional cube to create KPIs based on predefined business needs.
- Implement KPI definitions using MDX in SQL Server Analysis Services (Multidimensional).
- Ensure the KPIs are fully functional and can be browsed successfully.

Part 2: Define KPIs in SSAS Tabular
- Use an existing Tabular model.
- Implement DAX measures and set up the KPI targets, thresholds, and status indicators in SQL Server Analysis Services (Tabular).
- Confirm the KPIs work correctly when deployed.

Part 3: Build a Dashboard in SSRS
- Create an SSRS report or dashboard to display key insights from the data warehouse.
Skills: SQL, Microsoft SQL Server, Visual Studio Team Services, Microsoft SQL Server Administration
Fixed budget: 10 USD
Posted 7 hours ago
Data Engineer
Client Rank: Medium
Work on data and perform analytics based on previous data.
Skills: ETL Pipeline, Apache Spark, Big Data, Python, Data Modeling, Machine Learning, Apache Hadoop, Tableau, ETL, SQL
Fixed budget: 150 USD
Posted 6 hours ago
Dashboard and database stored on a Linux PC (not cloud)
Client Rank: Excellent ($46,125 total spent, 102 hires, 93 jobs posted, 100% hire rate, open job, 4.64 of 66 reviews)
Application running on Linux PC
1. Linux
2. Data from sensors (up to 500 sensors)
3. Storage of data coming from sensors
4. Menu to access dashboard, user management, device management, settings
5. Charts to show the data (x-axis is time)
6. Threshold-based alerts
7. Dashboard shows alerts, number of sensors active, number of sensors not sending data, etc.
8. Data sent to application using MQTT or HTTP (JSON)
9. Scheduler to send data to sensors (e.g. turn on/off lights)
10. Landing page of application for login (username and password)
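As a rough sketch of item 8 (sensor data arriving over MQTT as JSON and being stored locally), assuming the paho-mqtt client and a SQLite table; the broker address, topic, and schema are placeholders, and the dashboard/alerting layers are not shown.

```python
import json
import sqlite3
import paho.mqtt.client as mqtt

db = sqlite3.connect("sensors.db", check_same_thread=False)
db.execute("CREATE TABLE IF NOT EXISTS readings (sensor_id TEXT, ts TEXT, value REAL)")

def on_message(client, userdata, msg):
    # Expected payload shape (assumption): {"sensor_id": "s42", "ts": "...", "value": 21.5}
    payload = json.loads(msg.payload)
    db.execute("INSERT INTO readings VALUES (?, ?, ?)",
               (payload["sensor_id"], payload["ts"], payload["value"]))
    db.commit()

client = mqtt.Client()  # with paho-mqtt >= 2.0 use mqtt.Client(mqtt.CallbackAPIVersion.VERSION1)
client.on_message = on_message
client.connect("localhost", 1883)   # placeholder broker on the Linux PC
client.subscribe("sensors/#")
client.loop_forever()
```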
Skills: SQL, Linux, Dashboard, Python
Fixed budget: 3,000 USD
Posted 6 hours ago
Windows VPS Server Setup with IIS, ASP, SQL, .NET, and Exchange Mail Server
Client Rank: Good ($3,254 total spent, 18 hires, 8 jobs posted, 100% hire rate, open job, 4.75 of 9 reviews)
We are looking for a skilled MCSA or equivalent professional to configure our Windows VPS server with Plesk. The job involves setting up IIS, ASP, SQL Server, .NET 9, and Exchange Mail Server to ensure optimal performance and security. The ideal candidate should possess extensive experience in server management and configuration, with a proven track record in similar projects. If you have the expertise in creating a robust hosting environment and are familiar with the latest technologies, we would love to hear from you.
Skills: Microsoft IIS, Windows Server, System Administration, Windows Administration, Microsoft SQL Server
Hourly rate: 20 - 50 USD
Posted 6 hours ago
Dot Net, Java & Data Engineer Needed
Client Rank: Risky
We are looking for an experienced freelancer proficient in .NET, Java, and Data Engineering for ETL development using Python. The ideal candidate should have expertise in:
- .NET & Java: Strong backend development skills, API development, and integration.
- ETL & Data Engineering: Experience with Python-based ETL pipelines, data transformation, and processing.
- Cloud & Databases: Familiarity with GCP, BigQuery, Teradata, SQL/NoSQL databases, and data migration.
- Kafka & Microservices: Experience in event-driven architecture and microservices is a plus.
Skills: SQL, API, Java, JavaScript, Microsoft SQL Server, .NET Framework, Database, ASP.NET, RESTful API, PostgreSQL, Python, C#, Node.js
Fixed budget: 30,000 USD
Posted 5 hours ago
LF Certified GCP Data Engineer - preferably Polish speaking
Client Rank: Excellent ($11,170 total spent, 99 hires, 55 jobs posted, 100% hire rate, open job, 4.45 of 66 reviews)
Hi!
We are looking for a Certified GCP Data Engineer to join our team! In this position you will be responsible for creating secure and automated data pipelines from various data sources to a BigQuery data warehouse. Additionally, you will be responsible for data automation and orchestration within BigQuery itself, working with tools like dbt and Dataform. The ideal candidate will have solid experience with Google Cloud Platform and will know most of the data-related tools: Dataform, BigQuery, Dataplex, Data Fusion, Datastream, Pub/Sub, Cloud Run, and Cloud Functions. Candidates from Poland and Ukraine will be prioritized, as will candidates with the GCP Professional Data Engineer certification. This role requires a minimum of 40-60 hours per month, so please do not apply if you cannot commit to that bare minimum, to avoid wasting your time and ours. Please put "GCP" at the start of your proposal.
Skills: SQL, Data Engineering, Apache Airflow, Google Cloud Platform, Python
Budget: not specified
Posted 5 hours ago
React DuckDB UI
Client Rank: Excellent ($171,684 total spent, 100 hires, 54 jobs posted, 100% hire rate, open job, 4.97 of 60 reviews)
Using React (and any UI components of your choice), build a fully client-side app that:
1. Loads a small sample table into DuckDB-WASM (the table shall contain multiple types).
2. Queries the table to extract column types and statistics. For this you'll need to take UI inspiration and SQL commands from https://duckdb.org/2025/03/12/duckdb-ui.html
3. Displays the results clearly in a simple UI, such as the screenshot.
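The column-type and statistics queries can be prototyped outside the browser. Here is a sketch using the duckdb Python package rather than DuckDB-WASM; the sample table is invented, and the React/WASM wiring is left to the implementer.

```python
import duckdb

con = duckdb.connect()  # in-memory database
con.execute("""
    CREATE TABLE sample AS
    SELECT * FROM (VALUES
        (1, 'alice', 3.5, DATE '2024-01-01'),
        (2, 'bob',   4.2, DATE '2024-02-15'),
        (3, 'carol', NULL, DATE '2024-03-12')
    ) AS t(id, name, score, joined)
""")

# Column names and types.
print(con.execute("DESCRIBE sample").fetchall())

# Per-column statistics (min, max, approx_unique, null percentage, ...).
print(con.execute("SUMMARIZE sample").fetchall())
```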
Skills: React
Budget: not specified
Posted 5 hours ago
App Development Using Retool
Client Rank: Excellent ($79,270 total spent, 38 hires, 27 jobs posted, 100% hire rate, open job, 4.98 of 26 reviews)
Project Overview:
We are looking for an experienced developer to build an internal web-based application on Retool to manage user accounts, software subscriptions, and associated devices. The app will allow us to track licenses, automate data updates, and integrate with various platforms through APIs.

Key Features & Requirements:
✅ User Authentication – Secure login system for internal access.
✅ Client/User Management – Select a client and view all their associated users.
✅ Subscription Tracking – Display all active software licenses per user (e.g., Microsoft 365, Adobe, Google Workspace, Antivirus).
✅ Device Association – Track assigned devices (PC, Mac, Phone) and their associated software/licenses (e.g., Antivirus, MDM solutions).
✅ Automated Data Parsing – Ability to upload reports (e.g., Office 365 license exports) and automatically update records.
✅ REST API Integration – Connect with external platforms (Microsoft 365, Google Workspace, Adobe, etc.) to pull live data.
✅ Scalability & Security – Ensure the platform is secure, efficient, and scalable.

Ideal Skills & Experience:
- Proficiency in Retool.
- Proficiency in REST API development and integrations.
- Experience with database management and processing structured data.
- Front-end and back-end development expertise (React, Angular, Vue, or similar).
- Strong understanding of cloud platforms (Azure, AWS, or Google Cloud).
- Experience working with identity and licensing APIs (Microsoft Graph API, Google Workspace API, Adobe API).
- Strong problem-solving skills and ability to work independently.

Deliverables:
- Fully functional web application with a user-friendly interface.
- Secure authentication and role-based access control.
- API integrations with software platforms for real-time data updates.
- Ability to upload reports and automatically parse data into the system.
- Clear documentation for future maintenance and enhancements.

Next Steps: If you have experience building similar applications, please submit your proposal with:
- Your experience with Retool, API integrations, and subscription management tools.
- Examples of similar projects you’ve worked on.

We’re looking for a reliable, detail-oriented developer to build a robust and scalable solution. Looking forward to your proposals!
Skills: API Integration, JavaScript, Java, PHP, SQL, Web Application, API
Hourly rate: 10 - 20 USD
Posted 5 hours ago
Set up Vertex AI Search in GCP
Client Rank: Good ($2,040 total spent, 12 hires, 12 jobs posted, 100% hire rate, open job, 4.15 of 6 reviews)
I have an application in which we want to implement AI search. To do this, we need to set up Vertex AI Search using the existing Postgres database. This is a straightforward task for anyone who knows Vertex AI and GCP with Cloud SQL.
Skills: Google Cloud Platform, Python, Artificial Intelligence, API
Fixed budget: 150 USD
Posted 5 hours ago
Full Stack Developer Needed for QuickBooks Integration
Client Rank: Excellent ($12,174 total spent, 19 hires, 11 jobs posted, 100% hire rate, open job, 4.49 of 9 reviews)
We are seeking an experienced IT full stack developer to assist us in integrating QuickBooks Intuit accounting software into our existing systems, including our CRM, TMS, FMS, and WMS applications. The ideal candidate will have a strong background in software development and a deep understanding of both front-end and back-end technologies. You will work closely with our team to ensure seamless integration and optimization of our systems. If you have a passion for efficient software solutions and a proven track record in similar projects, we would love to hear from you.
About Us: Shypv is a leading provider of integrated warehouse management, transportation management, fleet management, and customer relationship management solutions. Our innovative software solutions help businesses efficiently manage their operations. We're seeking a talented Full Stack Developer who specializes in integrating systems like Intuit QuickBooks (www.quickbooks.intuit.com) into our suite of in-house software solutions.

Position Overview: As a Full Stack IT Developer with expertise in QuickBooks integration, you will play a crucial role in developing seamless connections between QuickBooks and our internal WMS, TMS, FMS, and CRM systems. You'll work with our talented tech team to enhance functionality and improve user experience.

Key Responsibilities:
- Integration Development: Design, develop, and implement integrations for QuickBooks with our existing WMS, TMS, FMS, and CRM software.
- System Architecture: Collaborate on the design of system architecture to support new integrations.
- Testing & Deployment: Conduct rigorous testing to ensure seamless integration without disrupting existing operations, followed by deployment and monitoring.
- Cross-Functional Collaboration: Engage with cross-functional teams to understand user requirements, ensuring solutions align with business goals.
- Documentation: Create and maintain detailed technical documentation for integration processes and configurations.
- Troubleshooting: Identify and resolve integration issues quickly to minimize downtime and ensure data integrity.

Requirements:
- Education: Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Experience: Proven experience in full-stack development and, specifically, integrating QuickBooks with other software solutions.
- Technical Skills:
  - Proficient in languages such as JavaScript, Python, or Java.
  - Experience with APIs, web services, and RESTful architecture.
  - Familiarity with database systems like MySQL, MongoDB, or SQL Server.
  - Knowledge of cloud services and infrastructure is a plus.
- Soft Skills: Strong problem-solving skills, excellent communication abilities, and a team-player attitude.

Preferred Qualifications:
- Experience working with WMS, TMS, FMS, or CRM systems.
- Prior experience in logistics or accounting industries.
- QuickBooks certification or similar credentials.
Skills: API, API Integration, JavaScript, PHP, Python
Fixed budget: 1,000 USD
Posted 4 hours ago
N- 4 Full-time Laravel + Vue Developers Opportunity. Long-term Opportunities
Client Rank: Excellent ($1,241 total spent, 78 hires, 52 jobs posted, 100% hire rate, open job, 5.00 of 39 reviews)
Laravel Full-time Developers to work on our projects (software) on a "fixed-price" basis.
Please send your expected monthly salary if working on a monthly-salary basis. The lowest offer from an experienced developer will have a better chance. The monthly salary will be used to calculate the hourly rate and the time needed to complete different task documents, at an agreed fixed price. The best offers with the right experience will have the opportunity to work with us on a long-term basis. There are also good prospects for the right developers, based on performance evaluated over a period of time, once you have established your skills. You need to provide the changed code details and SQL queries, if any, at the end of each sub-task, with the changed code line numbers. Unlimited revisions are needed. Any bugs created by your own development are to be fixed free of charge. Recruitment is for a long-term, big project; as such, all developers are subject to practical skill task tests, for which payment will not be made.

Please send your answers to the questions below:
1. What type of software have you built in Laravel + Vue?
2. How many years of industry experience do you have in Laravel software development?
3. How many years of industry experience do you have in Vue software development?
4. How many years of industry experience do you have in Laravel + Vue software development?
5. What are your other skills and capabilities?
6. Are you employed anywhere now, and if so, in which capacity and with what responsibilities?
7. What is your lowest expected monthly salary in USD?
8. How many working days per week?
9. How many working hours per day?
10. Are you ready to work project-wise (based on the above hourly rate)?
Skills: Node.js, Laravel, MySQL, PHP, Vue.js
Fixed budget: 5 USD
Posted 4 hours ago
Set up Postgres pgvector and pgai
Client Rank: Excellent ($11,168 total spent, 195 hires, 135 jobs posted, 100% hire rate, open job, 4.90 of 123 reviews)
I am looking for a programmer to help me install and configure Postgres pgai.
Sample: https://github.com/timescale/pgai?tab=readme-ov-file
Only apply if you already know how to do this.
Skills: PostgreSQL, PostgreSQL Programming, SQL, Linux, Python
Fixed budget: 10 USD
Posted 4 hours ago
Business Analytics & SAP BW Consultant (Healthcare Supply Chain)
Client Rank: Medium ($140 total spent, 10 hires, 1 job posted, 100% hire rate, open job)
We are looking for a candidate with 5 years of experience in the technologies below.
Tech stack: Business Analytics, Healthcare supply chain, business warehouse tools like SAP
Duration: 2 hrs/day
Budget: 22k/month
Time: Will fix
Working type: Screen-sharing work via Zoom call.
Skills: SAP, Business Analysis, Business Intelligence, SAP Business Warehouse, SQL
Fixed budget: 253 USD
Posted 3 hours ago
CRUD API Backend Development with User Authentication
Client Rank: Risky
I am looking for a developer to create a CRUD API backend with user authentication for my project. The API should allow for the following features:
Features Required:
- CRUD Operations: Create, Read, Update, and Delete users and other relevant data.
- Authentication: Secure user registration, login, and logout. JWT or OAuth 2.0 token-based authentication. Password hashing and user session management.
- Roles and Permissions: Role-based access control for different users (e.g., admin and regular users).
- Database: Connect to a NoSQL (MongoDB) or SQL database (MySQL, PostgreSQL, etc.).
- Error Handling & Validation: Proper error handling and input validation for each API endpoint.

Technologies:
- Preferred stack: Node.js (Express) or Python (Django/Flask) for the backend.
- Database: MongoDB or SQL-based (open to recommendations).

Requirements:
- Good understanding of REST API principles.
- Experience with user authentication and authorization.
- Ability to provide clean and well-documented code.
- Clear communication and ability to work with feedback.

Please include:
- Examples of similar work (if available).
- Your proposed timeline and budget estimate.
- Any questions you might have.

Looking forward to your proposals!
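As a minimal illustration of the token-based authentication requested above (the post leaves the stack open between Node/Express and Python), here is a hedged sketch using Flask plus PyJWT; the credential check, secret, and routes are placeholders.

```python
import datetime
import jwt  # PyJWT
from flask import Flask, request, jsonify

app = Flask(__name__)
SECRET = "change-me"  # placeholder; load from the environment in real use

@app.post("/login")
def login():
    creds = request.get_json()
    # Placeholder check; a real backend verifies a salted password hash stored in the database.
    if creds.get("username") == "admin" and creds.get("password") == "secret":
        token = jwt.encode(
            {"sub": creds["username"], "role": "admin",
             "exp": datetime.datetime.utcnow() + datetime.timedelta(hours=1)},
            SECRET, algorithm="HS256")
        return jsonify(token=token)
    return jsonify(error="invalid credentials"), 401

@app.get("/me")
def me():
    auth = request.headers.get("Authorization", "")
    try:
        claims = jwt.decode(auth.removeprefix("Bearer "), SECRET, algorithms=["HS256"])
    except jwt.PyJWTError:
        return jsonify(error="invalid or expired token"), 401
    return jsonify(user=claims["sub"], role=claims["role"])
```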
Skills: API, RESTful API, JavaScript, Node.js, MongoDB, HTTP, ExpressJS
Fixed budget: 10 USD
Posted 3 hours ago
Data Scientist Needed for Insights and Analysis
Client Rank: Medium
We are seeking a skilled Data Scientist to join our team to analyze data and provide actionable insights. The ideal candidate will have experience in statistical analysis, machine learning, and data visualization. You will work closely with our data team to develop models that drive business decisions. If you are passionate about data and have a proven track record in this field, we would love to hear from you!
Relevant Skills:
- Statistical Analysis
- Machine Learning
- Data Mining
- Data Visualization (e.g., Tableau, Power BI)
- Python/R Programming
- SQL
- Big Data Technologies (Hadoop, Spark)
- Strong Communication Skills
Skills: Data Analysis, Statistics, Data Science, Python, R
Hourly rate: 15 - 50 USD
Posted 2 hours ago
Senior Python Developer (FastAPI, AWS)
Client Rank: Excellent ($42,085 total spent, 45 hires, 37 jobs posted, 100% hire rate, open job, 4.73 of 34 reviews)
Requirements:
- 5+ years of Python development experience
- Expertise in FastAPI and building high-performance APIs
- Strong knowledge of AWS services (Lambda, S3, RDS, etc.)
- Experience with Docker, CI/CD, and cloud deployments
- Proficient in SQL & NoSQL databases
- Strong problem-solving and optimization skills

Responsibilities:
- Develop, optimize, and maintain FastAPI-based microservices
- Deploy and manage applications on AWS
- Ensure scalability, security, and performance
- Collaborate with teams for seamless integration
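For reference, the shape of a small FastAPI service of the kind described above; the endpoint, model, and in-memory store are illustrative only, and real deployments would sit behind the AWS services listed (Lambda/EC2 with an RDS-backed database).

```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="example-service")

class Item(BaseModel):
    name: str
    price: float

ITEMS: dict[int, Item] = {}  # in-memory stand-in for an RDS/NoSQL store

@app.post("/items/{item_id}")
def create_item(item_id: int, item: Item) -> Item:
    ITEMS[item_id] = item
    return item

@app.get("/items/{item_id}")
def read_item(item_id: int) -> Item:
    if item_id not in ITEMS:
        raise HTTPException(status_code=404, detail="item not found")
    return ITEMS[item_id]

# Run locally with: uvicorn main:app --reload
```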
Skills: Amazon Web Services, Python, Amazon S3, Python Script, RESTful API, Amazon EC2, AWS Lambda, API
Fixed budget: 600 USD
Posted 2 hours ago
Creating a performance ranking from industry data
Client Rank: Medium ($130 total spent, 2 hires, 2 jobs posted, 100% hire rate, open job, GBR)
I have input data relating to firm rankings in various business areas, salary data, and prestige information in Postgres. We have ingested 200k employees into this database and now want to rank them all individually based on their profiles. To do this we need to look at the rankings for each firm within each business area, as well as giving a score to individuals.
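A hedged sketch of one way to turn such inputs into an individual ranking with pandas; the column names and the 50/30/20 weighting are assumptions standing in for the client's actual scoring rules.

```python
import pandas as pd

# Hypothetical per-employee frame joined from the Postgres tables described above.
employees = pd.DataFrame({
    "employee_id": [1, 2, 3],
    "firm_rank_in_area": [1, 5, 12],   # firm's rank within the employee's business area (1 = best)
    "salary": [120_000, 95_000, 150_000],
    "prestige": [0.9, 0.6, 0.75],      # normalised 0-1 prestige score
})

# Normalise each component to roughly 0-1, then combine with assumed weights.
firm_score = 1 / employees["firm_rank_in_area"]
salary_score = employees["salary"] / employees["salary"].max()
employees["score"] = 0.5 * firm_score + 0.3 * salary_score + 0.2 * employees["prestige"]

employees["rank"] = employees["score"].rank(ascending=False, method="min").astype(int)
print(employees.sort_values("rank"))
```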
Skills: SQL, Python, Statistics, Data Analytics
Budget: not specified
Posted 2 hours ago
Web Scraping & Automation Expert for Legal Rulings & Legislation Database
Client Rank: Medium
We are looking for an expert in web scraping, automation, and AI-driven classification to develop a system that extracts, structures, and continuously updates two separate legal databases:
1. A database of judicial rulings (case law)
2. A database of laws and regulations (legislation)

Both databases must be automatically updated whenever new rulings or laws are published and must be searchable using an AI-powered retrieval system.
📌 The name of the target website will be provided in private.

🔹 Project Scope & Goals:

1️⃣ Web Scraping & Data Extraction

📌 Judicial Rulings Database (Case Law Extraction)
• Extract all available rulings from the target website.
• Collect judgments from each judicial authority:
  • Tribunale (Court of First Instance)
  • Corte d’Appello (Court of Appeal)
  • Corte di Cassazione (Supreme Court of Cassation)
  • Giudice di Pace (Justice of the Peace)
• Classify judgments by subject matter (e.g., labor law, civil liability, administrative law).
• Store the extracted judgments in a structured relational database (PostgreSQL, MySQL, or another suitable solution).

📌 Legislation Database (Normattiva.it Extraction)
• Extract all laws and regulations from Normattiva, Italy’s official legal archive.
• Structure the extracted legislation into a separate, organized database.
• Ensure automatic updates whenever new laws or amendments are published.

2️⃣ Database Architecture & AI Integration

📌 Two Fully Automated Databases
• Case Law Database: stores extracted judicial rulings.
• Legislation Database: stores extracted laws and regulations.
• Each database must update automatically whenever new rulings or legal texts are available.

📌 AI-Based Classification & Search
• Implement an LLM (Large Language Model) to:
  • Classify judgments by judicial authority and subject matter (e.g., labor law, tax law).
  • Extract key insights from legislation (summarization and tagging).
• Enable AI-powered semantic search, allowing legal professionals to search using natural language queries.

📌 Automated Database Updates
• The system must periodically check for updates and integrate new rulings and legislative amendments.
• Ensure robust error handling and duplicate prevention.

3️⃣ Export, API, & Data Accessibility

📌 Export Options
• Data should be exportable in CSV, JSON, and SQL formats.
• Provide an API to allow external systems to query both databases.

📌 Web-Based Query System (Optional Feature)
• If feasible, provide a basic web interface where users can search both case law and legislation databases.

🔹 Key Technical Considerations:
✅ Anti-Scraping Protections
• Does the target website have anti-scraping mechanisms (CAPTCHAs, IP bans, JavaScript rendering)?
• What solutions do you propose for bypassing protections (e.g., Selenium, Playwright, rotating proxies, headless browsers)?
✅ Automation & Efficiency
• What workflow do you suggest to ensure smooth automatic updates?
• How would you handle error handling, rate limits, and optimization?
✅ Infrastructure & Scaling
• Preferred database: PostgreSQL or MySQL (flexible based on your expertise).
• Server setup: would AWS, Google Cloud, or another solution be preferable?

🔹 Ideal Candidate
✔ Proven experience in web scraping & automation.
✔ Expertise with Scrapy, BeautifulSoup, Selenium, or Playwright.
✔ Strong database knowledge (PostgreSQL, MySQL, or NoSQL solutions).
✔ Familiarity with LLM-based AI classification & retrieval (GPT, Mistral, DeepSeek, etc.).

If you are interested and available, please send:
✅ A brief overview of your experience in similar projects.
✅ A proposed technical approach for scraping, data storage, and automation.
✅ An estimated cost and timeline for completion.

Looking forward to collaborating!
Thank you, G.F.
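To make a proposal concrete, here is a very small sketch of the extract-and-store loop with duplicate prevention (requests + BeautifulSoup + PostgreSQL). The target site is not named in the post, so the URL, selectors, anti-scraping handling, and the LLM classification layer described above are all still to be specified.

```python
import requests
from bs4 import BeautifulSoup
import psycopg2

conn = psycopg2.connect("postgresql://user:pass@localhost:5432/caselaw")  # placeholder DSN
cur = conn.cursor()
cur.execute("""
    CREATE TABLE IF NOT EXISTS rulings (
        url TEXT PRIMARY KEY,   -- the primary key doubles as duplicate prevention
        court TEXT,
        title TEXT,
        body TEXT
    )
""")
conn.commit()

def scrape_ruling(url: str, court: str) -> None:
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    title = soup.find("h1").get_text(strip=True)   # placeholder selector for the real site
    body = soup.get_text(" ", strip=True)
    cur.execute(
        "INSERT INTO rulings (url, court, title, body) VALUES (%s, %s, %s, %s) "
        "ON CONFLICT (url) DO NOTHING",
        (url, court, title, body),
    )
    conn.commit()

# scrape_ruling("https://example.org/ruling/123", "Corte di Cassazione")  # illustrative call
```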
Skills: API Integration, Selenium, SQL, Data Scraping, Python, Data Mining, Data Extraction
Fixed budget: 200 USD
Posted 2 hours ago
Web Security Audit
Client Rank: Excellent ($55,806 total spent, 54 hires, 46 jobs posted, 100% hire rate, open job, 3.79 of 36 reviews)
We're looking for an experienced security professional to conduct a comprehensive penetration testing audit of our web application. The ideal candidate will:
- Identify vulnerabilities, security flaws, and potential attack vectors
- Perform thorough testing, including the OWASP Top 10 vulnerabilities
- Write "wrs" at the start of your cover letter
- Provide detailed documentation of findings with severity ratings
- Deliver actionable remediation recommendations with implementation guidance
- Work collaboratively with our development team to verify fixes

Required qualifications:
- Proven experience in web application security testing
- Strong knowledge of common web vulnerabilities (SQL injection, XSS, CSRF, etc.)
- Familiarity with modern security frameworks and testing methodologies
- Excellent communication skills for technical and non-technical stakeholders
- Relevant security certifications (OSCP, CEH, or equivalent)

This is a project-based role with potential for ongoing security partnership. Please include examples of previous penetration testing experience in your application.
Skills: Penetration Testing, Vulnerability Assessment, Network Security, Information Security
Fixed budget: 300 USD
Posted 1 hour ago
Postgres to Snowflake Data Migration Specialist
Client Rank: Risky
We are seeking an experienced data engineer to build a robust data pipeline for migrating our database from Postgres to Snowflake. The ideal candidate will have a strong understanding of both database technologies and expertise in data migration processes. Responsibilities include assessing existing data structures, designing the pipeline architecture, and ensuring data integrity throughout the migration. If you have a proven track record in similar projects, we would love to hear from you!
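A hedged outline of the extract-and-load step with the official connectors (pandas read from Postgres, write_pandas into Snowflake). Credentials, table names, and the target schema are placeholders, the Snowflake table is assumed to already exist, and the assessment, incremental-load, and validation work described above would sit around this.

```python
import pandas as pd
from sqlalchemy import create_engine
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Extract from Postgres (placeholder DSN and table).
pg_engine = create_engine("postgresql+psycopg2://user:pass@localhost:5432/source_db")
df = pd.read_sql("SELECT * FROM public.orders", pg_engine)

# Load into Snowflake; the target table ORDERS is assumed to exist with matching columns.
sf_conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password",
    warehouse="COMPUTE_WH", database="ANALYTICS", schema="PUBLIC",
)
success, n_chunks, n_rows, _ = write_pandas(sf_conn, df, "ORDERS")
print(f"loaded {n_rows} rows in {n_chunks} chunks, success={success}")
```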
Skills: PostgreSQL, SQL, ETL Pipeline, PostgreSQL Programming, ETL
Hourly rate: 5 - 10 USD
Posted 46 minutes ago
Tableau Expert
Client Rank: Risky (IND)
We are seeking an experienced Tableau professional to assist with [specific project details]. The ideal candidate will have a strong background in data visualization, reporting, and storytelling through Tableau.
Skills: Python, R, Tableau, Data Visualization, SQL, Microsoft Power BI, Salesforce, Google Analytics, Analytics
Hourly rate: 5 - 30 USD
Posted 33 minutes ago
Streamline your Upwork workflow and boost your earnings with our smart job search and filtering tools. Find better clients and land more contracts.