Convoy v24.4.1 is out 🎉 Lots of features I'm excited about:
- You can now stream, transform, and send webhooks directly from our Message Broker sources (Kafka, etc.). Previously, you had to transform the event before publishing it to the topic Convoy reads from. Now you can plug Convoy directly into your CDC pipeline! h/t to Matt Ober for beta testing this!
- A community member (Nitzan Goldfeder) contributed a RabbitMQ source! This is exactly why we love open source!
- We re-implemented our rate limiter on top of PostgreSQL (part of our continuous effort to phase out Redis; long live PostgreSQL!)
And a lot more bug fixes. :) Link to the full post in the comments!
Convoy v24.4.1 releases with new features
More Relevant Posts
-
🚀 Deploy a 3-node Kafka cluster in a few minutes on OVHcloud multi-regions
In this example, we'll deploy Kafka in cluster mode (KRaft, i.e. Apache Kafka without ZooKeeper). Here we'll look at how to deploy 4 services with a single YAML file:
👉 3 controllers and brokers
👉 1 kafka-ui (web interface)
🔗 YAML file to deploy in 5 minutes and other resources: https://lnkd.in/ets6Gcf8
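If you want to sanity-check the cluster once the stack from that YAML is up, the stock Kafka CLI is enough. A minimal sketch, assuming you run it from a shell inside any broker container and that the brokers listen on localhost:9092 (adjust to the actual deployment):
# Check that the 3 KRaft controllers have formed a quorum
kafka-metadata-quorum.sh --bootstrap-server localhost:9092 describe --status
# Create a smoke-test topic replicated across the 3 brokers
kafka-topics.sh --bootstrap-server localhost:9092 --create --topic smoke-test --partitions 3 --replication-factor 3
# Confirm the partition leaders are spread across all 3 brokers
kafka-topics.sh --bootstrap-server localhost:9092 --describe --topic smoke-test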
-
Pavlo Golub: vip-manager v2.8 meets Patroni REST API 🚀 Exciting News for PostgreSQL Enthusiasts! 🚀 If your Saturday nights are spent tinkering with virtual IPs in PostgreSQL HA setups, then Pavlo Golub has a treat for you—vip-manager v2.8.0 is here! 🎉 Gone are the days of complex configurations
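The idea behind vip-manager is that the virtual IP follows the current leader, and the new Patroni integration lets it ask Patroni's REST API directly instead of watching DCS keys. You can reproduce that leadership check by hand; a minimal sketch, assuming Patroni's REST API listens on its default port 8008 on the local node:
# HTTP 200 means this node is the primary (the VIP belongs here); 503 means it is a replica
curl -s -o /dev/null -w "%{http_code}\n" http://127.0.0.1:8008/primary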
-
Why did I decide to build a product around databases? There was a time when my team and I managed around 200 database servers running MySQL and MariaDB. The pain point was that only experienced engineers could tune the database servers; it took a lot of time, and even they made human errors. That's why I decided to simplify the tuning process and save them time. I tried to find an application that could tune MySQL for the best performance, preferably in a fully automatic mode. My search did not return a solution that could be called good, let alone perfect. So I decided to build it. The first version didn't take much time. It was a simple rule-based cloud platform combined with a Bash script: the script sent a MySQLTuner JSON report to the platform and received a configuration file in return (a rough sketch of that flow follows below). This version saved our team dozens of human hours monthly, allowing engineers to focus on other tasks. But that was just the first step... Any kind of support for our Product Hunt launch is much appreciated ❤️
Founder at Releem - MySQL Performance Monitoring and Tuning Tool | DevOps/SRE engineer | Entrepreneur
Releem is now live on Product Hunt! 🤞 https://lnkd.in/eChr38Dd It started as a side project 4 years ago, and it has now helped manage and tune MySQL on thousands of database servers for over 4,000 users. Any kind of support is much appreciated ❤️ Thank you!
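For the curious, that first version boils down to a very small script. The sketch below is illustrative only: the platform URL and file paths are assumptions rather than Releem's actual agent, and it leans on MySQLTuner's --json and --outputfile options:
# Generate a MySQLTuner report as JSON
perl mysqltuner.pl --json --outputfile /tmp/mysqltuner-report.json
# Send the report to the (hypothetical) tuning platform and save the configuration file it returns
curl -s -X POST https://tuning-platform.example.com/api/v1/reports \
  -H "Content-Type: application/json" \
  --data @/tmp/mysqltuner-report.json \
  -o /etc/mysql/conf.d/tuning.cnf
# Apply the new settings
sudo systemctl restart mysql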
-
For production systems, this is helpful if you need to change the number of partitions or reset the offsets for a consumer group (see the examples after the link below).
If you need quick access to Kafka (check a topic, produce a test message, etc.), you can use the https://lnkd.in/eM8YAYQH Docker image and override the entrypoint. Just run
docker run -it --entrypoint=/bin/bash wurstmeister/kafka:latest -i
and you will have access to the Kafka command line tools:
kafka-topics.sh --create --topic dino --bootstrap-server broker:9092
kafka-console-producer.sh --topic dino --bootstrap-server broker:9092
Or just be a normal person and use apps like https://lnkd.in/eQFFQhb6
hub.docker.com
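The production tasks mentioned above work from the same shell with the same stock tools. A rough sketch, with the topic, group, and bootstrap address as placeholders:
# Increase a topic's partition count (partitions can only ever be increased, not decreased)
kafka-topics.sh --alter --topic dino --partitions 6 --bootstrap-server broker:9092
# Reset a consumer group's offsets to the earliest position (drop --execute for a dry run)
kafka-consumer-groups.sh --reset-offsets --to-earliest --group my-group --topic dino --execute --bootstrap-server broker:9092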
-
🛳 🎉 Shipping continues at full throttle. Here is our latest release, version 0.15.1: https://lnkd.in/gCf9rf5e. This release focuses on Security, Stability, and Usability. The goal was to take PeerDB multiple steps further in providing an enterprise-grade CDC experience for Postgres. A few highlights include:
🔐 Encrypting credentials stored in the peers table.
🚫 Option to disable metadata columns, enabling online migrations.
🚀 Faster Avro conversion.
📊 Better data-type handling.
⚡ Configurable settings page to tune and manage MIRRORs.
-
Effortless Kafka TLS Setup: Recently I was tasked with developing an in-class demonstration of setting up Apache Kafka TLS. Kafka TLS enables mutual TLS communication between Kafka brokers and clients, viz. producers and consumers, using SSL encryption. The online material is confusing at first look, probably because it was written for different setups 🙂. So I took the initiative to build something more usable. Here are the key steps (a condensed command sketch follows after the repo link below):
0. Download OpenSSL (pro tip: use a Windows installer).
1. CA creation
2. Server & client keystore and truststore generation
3. Certificate signing requests
4. Certificate signing
5. Certificate imports
6. Configuring servers: edit the server.properties file.
7. Configuring producers and consumers: create a client.properties file.
8. Run the server with server.properties.
9. Run producers with the --producer.config parameter.
10. Run consumers with the --consumer.config parameter.
Here is the link to a simple GitHub repo that automates the process: https://lnkd.in/g8NnkJqW The setup.bat file automates steps 1 to 5. Check the config folder for example client and server properties files. This significantly simplifies the often complex task of securing your Kafka cluster. It's all about ease of use without compromising on security. Give it a try! #Kafka #TLS #Automation #Security #ApacheKafka #DevOps #DataEngineering #EasySetup #BatchScript
GitHub - devmukherjee/kafka_tls: Automate your kafka tls setup with batch files.
github.com
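For reference, steps 1 to 5 condense to the standard keytool/openssl commands from the Kafka SSL documentation, which the setup.bat in the repo wraps. A sketch with placeholder file names, aliases, and validity periods:
# 1. Create your own certificate authority (CA)
openssl req -new -x509 -keyout ca-key -out ca-cert -days 365
# 2. Generate the broker keystore and trust the CA in a truststore
keytool -keystore server.keystore.jks -alias broker -genkey -keyalg RSA
keytool -keystore server.truststore.jks -alias CARoot -import -file ca-cert
# 3. Create a certificate signing request for the broker key
keytool -keystore server.keystore.jks -alias broker -certreq -file cert-req
# 4. Sign the request with the CA
openssl x509 -req -CA ca-cert -CAkey ca-key -in cert-req -out cert-signed -days 365 -CAcreateserial
# 5. Import the CA certificate and the signed certificate into the broker keystore
keytool -keystore server.keystore.jks -alias CARoot -import -file ca-cert
keytool -keystore server.keystore.jks -alias broker -import -file cert-signed
# Repeat steps 2-5 with client keystores/truststores for producers and consumers to get mutual TLS.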
-
🚀 Big news from the Apache Hop community! After 4 months of hard work on 139 tickets, Apache Hop 2.10.0 is here! 🎉 This is the biggest release yet, packed with new features, bug fixes, and improvements. https://lnkd.in/e9E37aAF
🔧 Key updates:
- Better macOS UI: fixed dialog spacing issues.
- Java 17 upgrade: Hop is now built on Java 17.
- Azure support: configure multiple Azure storage accounts.
- File Explorer upgrades: recognizes more file types with enhanced features.
- Enhanced Git integration: manage your version control easily within the GUI.
📈 The Apache Hop community is growing too:
867 chat members
2,262 LinkedIn followers
1,220 YouTube subscribers
Check out the release and see how Apache Hop 2.10.0 can boost your data orchestration! 💬 Reach out if you want to learn more or need help with your data platform. #apachehop #dataintegration #dataplatform #dataengineering
-
🎉 Go SDK for Pinecone is now generally available. You get:
- Access to the full database API, including new index deletion protection
- Expanded configuration for gRPC implementations
- Improved development experience, including more detailed error responses and improved JSON annotations
Start building today: https://hubs.ly/Q02Kj3ZY0
-
Just want to quickly share that I have completed a project that lets users save their personal notes to the server, with a proper user authentication system so that no one can access another user's private notes. I learnt many things during this project, including some database concepts, most parts of the backend, and how to create your own API with the backend and interact with it via various types of network requests. So if you'd like to give it a try, the link to the GitHub repo is below: https://lnkd.in/gFBxjKuw
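If you do try it, interacting with an API like this usually comes down to a couple of authenticated HTTP requests. The routes and payloads below are purely hypothetical placeholders, not taken from the repo, so check its README for the real ones:
# Log in (hypothetical endpoint) and get back an auth token
curl -X POST http://localhost:3000/api/auth/login \
  -H "Content-Type: application/json" \
  -d '{"email":"user@example.com","password":"secret"}'
# Send the returned token with every notes request so only this user sees their notes
curl http://localhost:3000/api/notes -H "Authorization: Bearer <token>"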
-
#ApacheKafka 3.8 is here, and features 17 new KIPs:
● 13 related to Kafka Core and clients
● 3 for Kafka Streams
● 1 for Kafka Connect
In this new video, Danica Fine provides a summary of the latest updates and new features in this Apache Kafka release! Watch it here ➡️ https://lnkd.in/eTbDZi5Q
Co-Founder & CEO at Convoy | Building a Webhooks Gateway
Full post: https://getconvoy.io/blog/convoy-v24.4.1