Add release blog post #425

Closed · wants to merge 10 commits
Changes from 8 commits
4 changes: 2 additions & 2 deletions code-to-generate-connect-pages/index.md
@@ -12,9 +12,9 @@
</div>
</div>

Quix helps you integrate Apache Kafka with Apache Airflow using pure Python.
Quix helps you integrate Apache Kafka with [technology-name] using pure Python.

Transform and pre-process data, with the new alternative to Confluent Kafka Connect, before loading it into a specific format, simplifying data lake house arthitectures, reducing storage and ownership costs and enabling data teams to achieve success for your business.
Transform and pre-process data, with the new alternative to Confluent Kafka Connect, before loading it into a specific format, simplifying data lake house architecture, reducing storage and ownership costs and enabling data teams to achieve success for your business.

## [technology-name]

36 changes: 22 additions & 14 deletions code-to-generate-connect-pages/main.py
@@ -126,7 +126,10 @@ def get_lc_tech_name(tn):

# Generate description about the tech
description_prompt = f"You are a big shot tech writer with over 50 years of tech writing experience under your belt. You know everything there is to know about technology and how to apply it.\
Write a paragraph describing the technology called {tech_name}. If {tech_name} is not a data technology you recognise, please reply with 'UNRECOGNIZED TECH ALERT' "
Write a paragraph describing the technology called {tech_name}.\
If {tech_name} is not a data technology you recognise, please reply with 'UNRECOGNIZED TECH ALERT'.\
Under no circumstances should you use sentences like 'As a seasoned tech writer' or talk about yourself in the first person.\
Do not say things like 'Users are encouraged to explore the platform, book demos, and engage with the community through resources like GitHub and Slack'."

tech_description = generate_text(description_prompt, no_ai)
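The `generate_text` helper itself is not shown in this diff. Purely as an illustration, one plausible shape for it is sketched below; the OpenAI client usage, the model name, and the meaning of the `no_ai` flag are assumptions, not taken from this PR:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def generate_text(prompt: str, no_ai: bool) -> str:
    # Hypothetical helper: when no_ai is set, skip the API call so the
    # templating flow can be exercised without spending tokens.
    if no_ai:
        return "PLACEHOLDER TEXT (no_ai mode)"
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content
```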

@@ -140,7 +143,9 @@ def get_lc_tech_name(tn):

# Generate paragraph about why Quix is a good fit
quix_prompt = f"Your primary directive is: If {tech_name} is not a data technology you recognise, please reply with 'UNREGOGNIZED TECH ALERT'. Your other directive is: You are a big shot tech writer with over 50 years of tech writing experience under your belt. You know everything there is to know about technology and how to apply it. \
Explain why Quix is a good fit for integrating with the technology called {tech_name}. Use this information for reference {quix_info}."
Explain why Quix is a good fit for integrating with the technology called {tech_name}. Use this information for reference {quix_info}.\
Under no circumstances should you use sentences like 'As a seasoned tech writer' or talk about yourself in the first person.\
Do not say things like 'Users are encouraged to explore the platform, book demos, and engage with the community through resources like GitHub and Slack'."

quix_description = generate_text(quix_prompt, no_ai)

@@ -155,19 +160,22 @@ def get_lc_tech_name(tn):
 content = content.replace("[blurb-about-tech-name]", tech_description)
 content = content.replace("[blurb-about-why]", quix_description)
 
-# Write the new content to a Markdown file
-with open(output_path, 'w', encoding='utf-8') as output_file:
-    output_file.write(content)
-
-image_urls = get_image_urls(tech_name, num_images=1)
-
-if image_urls:
-    save_directory = f"connect/images/"
-    download_images(image_urls, save_directory, lower_case_tech_name)
-else:
-    print("No images found.")
-
-print(f"Generated documentation for {tech_name}")
+if "UNRECOGNIZED TECH ALERT" in content:
+    print("NOT WRITING TECH. CONTAINS UNRECOGNIZED TECH ALERT")
+else:
+    # Write the new content to a Markdown file
+    with open(output_path, 'w', encoding='utf-8') as output_file:
+        output_file.write(content)
+
+    image_urls = get_image_urls(tech_name, num_images=1)
+
+    if image_urls:
+        save_directory = f"connect/images/"
+        download_images(image_urls, save_directory, lower_case_tech_name)
+    else:
+        print("No images found.")
+
+    print(f"Generated documentation for {tech_name}")



4 changes: 2 additions & 2 deletions code-to-generate-connect-pages/print_urls.py
@@ -2,9 +2,9 @@

# Path to the directory containing the files
directory_path = '../docs/connect/'
url_base = "https://quixdocsdev.blob.core.windows.net/pr423/connect/"
url_base = "https://quix.io/docs/connect"

# Loop through each file in the directory
for filename in os.listdir(directory_path):
# Print the file name without the extension
print(f'{url_base}/kafka-to-{filename}.html')
print(f'{url_base}/{filename.replace(".md", "")}.html')
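An equivalent sketch of this loop that also skips non-Markdown entries (the directory may contain images or other files); the example file name in the comment is hypothetical:

```python
import os

directory_path = '../docs/connect/'
url_base = "https://quix.io/docs/connect"

for filename in os.listdir(directory_path):
    name, ext = os.path.splitext(filename)
    if ext == ".md":  # ignore images and anything else in the folder
        print(f"{url_base}/{name}.html")

# A hypothetical file kafka-to-apache-airflow.md would print:
# https://quix.io/docs/connect/kafka-to-apache-airflow.html
```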
Binary file added docs/blog/posts/images/diagram-iceberg-sink.png
44 changes: 44 additions & 0 deletions docs/blog/posts/release-scratchpads.md
@@ -0,0 +1,44 @@
---
title: "Quix Release: Scratchpads"
date: 2024-10-28
authors: [steve-rosam]
slug: quix-release-scratchpads
description: >
Learn about the latest Quix release.
categories:
- releases
hide:
- navigation
---

New features, bug fixes and performance upgrades!

<!-- more -->

## New features

- **Scratchpads:** Share topics between environments, assign resources only to the pipeline steps you are focused on, and merge code modifications back into Production easily.
- **Data tiers:** Assign a **`Bronze, Silver, or Gold`** classification to your data - or define your own tiers - for each topic, reflecting its data quality and level of pre-processing.
- **Quix CLI** 1.1.0 adds support for YAML variables in local development. [More info in docs](https://quix.io/docs/quix-cli/local-development/local-yaml-variables.html).

## Enhancements

- We have enabled replica configuration for Jobs. Users can now set the replica count for deployments of type `Job`, enhancing job concurrency control.
- Added support for separate private and public Library repositories. This feature allows dedicated clusters to configure separate repositories for private and public template items in the Library.
- Improved error descriptions when dealing with YAML and missing secret keys.
- Improved network configuration validation.
- Enhanced the readability of error messages in historical logs to make them more user-friendly.
- Optimized the 'Live Logs' download for improved performance.

## Bug Fixes

- Fixed a bug that prevented applications running in the online IDE from stopping under some conditions.
- Fixed a bug that caused deployment statuses to refresh incorrectly after a runtime error occurred.
- Vulnerability fixes and patches.

## Find Out More
If you want to find out more or have any questions at all, please get in touch.

<div class="" markdown>
<span>You can join our Slack community <a href="https://quix.io/slack-invite?_ga=join-from-docs-release-blog">here</a> or <a href="mailto:support@quix.io">send us an email</a>.</span>
</div>
Binary file modified docs/connect/images/apache-airflow_1.jpg
Binary file modified docs/connect/images/apache-camel_1.jpg
Binary file modified docs/connect/images/apache-curator_1.jpg
Binary file removed docs/connect/images/apache-hive_1.jpg
Binary file removed docs/connect/images/apache-ignite_1.jpg
Binary file modified docs/connect/images/apache-kafka_1.jpg
Binary file modified docs/connect/images/apache-pulsar_1.jpg
Binary file modified docs/connect/images/apache-tika_1.jpg
Binary file modified docs/connect/images/aws-amplify_1.jpg
Binary file modified docs/connect/images/aws-appsync_1.jpg
Binary file modified docs/connect/images/aws-cloudtrail_1.jpg
Binary file modified docs/connect/images/aws-codebuild_1.jpg
Binary file modified docs/connect/images/aws-glue_1.jpg
Binary file modified docs/connect/images/aws-iam_1.jpg
Binary file modified docs/connect/images/aws-redshift_1.jpg
Binary file modified docs/connect/images/aws-route-53_1.jpg
Binary file modified docs/connect/images/aws-secrets-manager_1.jpg
Binary file modified docs/connect/images/aws-security-hub_1.jpg
Binary file modified docs/connect/images/aws-simple-notification-service-(sns-_1.jpg
Binary file modified docs/connect/images/aws-step-functions_1.jpg
Binary file modified docs/connect/images/aws-systems-manager_1.jpg
Binary file modified docs/connect/images/aws-vpc_1.jpg
Binary file modified docs/connect/images/cloudflare_1.jpg
Binary file modified docs/connect/images/domo_1.jpg
Binary file modified docs/connect/images/flask_1.jpg
Binary file removed docs/connect/images/github_1.jpg
Binary file removed docs/connect/images/gitlab_1.jpg
Binary file modified docs/connect/images/google-drive_1.jpg
Binary file modified docs/connect/images/h2o-ai_1.jpg
Binary file modified docs/connect/images/ibm-db2_1.jpg
Binary file removed docs/connect/images/kafka_1.jpg
Binary file modified docs/connect/images/linkedin_1.jpg
Binary file modified docs/connect/images/luigi_1.jpg
Binary file modified docs/connect/images/mailchimp_1.jpg
Binary file modified docs/connect/images/medium_1.jpg
Binary file modified docs/connect/images/microsoft-onedrive_1.jpg
Binary file modified docs/connect/images/notion_1.jpg
Binary file modified docs/connect/images/numpy_1.jpg
Binary file modified docs/connect/images/prefect_1.jpg
Binary file removed docs/connect/images/prometheus_1.jpg
Binary file modified docs/connect/images/pytorch_1.jpg
Binary file modified docs/connect/images/salesforce_1.jpg
Binary file modified docs/connect/images/sas_1.jpg
Binary file removed docs/connect/images/scala_1.jpg
Binary file modified docs/connect/images/seaborn_1.jpg
Binary file modified docs/connect/images/shopify_1.jpg
Binary file removed docs/connect/images/spring-boot_1.jpg
Binary file removed docs/connect/images/squarespace_1.jpg
Binary file removed docs/connect/images/stitch_1.jpg
Binary file removed docs/connect/images/travis-ci_1.jpg
Binary file modified docs/connect/images/vertica_1.jpg
Binary file removed docs/connect/images/vimeo_1.jpg
Binary file modified docs/connect/images/wordpress_1.jpg
Binary file modified docs/connect/images/wrike_1.jpg
Binary file modified docs/connect/images/zoho_1.jpg
Binary file modified docs/connect/images/zoom_1.jpg
20 changes: 6 additions & 14 deletions docs/connect/kafka-to-apache-airflow.md
@@ -14,11 +14,11 @@

Quix helps you integrate Apache Kafka with Apache Airflow using pure Python.

Transform and pre-process data, with the new alternative to Confluent Kafka Connect, before loading it into a specific format, simplifying data lake house arthitectures, reducing storage and ownership costs and enabling data teams to achieve success for your business.
Transform and pre-process data, with the new alternative to Confluent Kafka Connect, before loading it into a specific format, simplifying data lake house architecture, reducing storage and ownership costs and enabling data teams to achieve success for your business.

## Apache Airflow

Apache Airflow is a powerful open-source tool used for orchestrating complex data workflows. It allows users to easily schedule and monitor workflows, ensuring efficient data processing and analysis. With a user-friendly interface and robust set of features, Apache Airflow has become a staple in the data engineering world. Its ability to automate and streamline workflow management makes it an essential tool for any data-driven organization.
Apache Airflow is an open-source platform used for orchestrating complex computational workflows and data processing pipelines. It allows users to schedule and monitor workflows, set up dependencies between tasks, and manage the execution of these tasks across a distributed system. Apache Airflow provides a rich set of tools and features for managing workflows, such as a web-based interface for visualizing workflow execution, a command-line interface for interacting with the system, and a metadata database for storing information about workflow runs. With Apache Airflow, users can efficiently automate and streamline their data processing tasks, improving productivity and reliability in their data pipelines.

## Integrations

@@ -31,19 +31,11 @@ Apache Airflow is a powerful open-source tool used for orchestrating complex dat
</div>


Quix is an excellent fit for integrating with Apache Airflow due to its various features that align with the needs of data engineers working with Apache Airflow.
Quix is a well-suited solution for integrating with Apache Airflow due to its ability to enable data engineers to pre-process and transform data from various sources before loading it into a specific data format. This simplifies the lakehouse architecture with customizable connectors for different destinations, ensuring a seamless integration process.

Firstly, Quix allows data engineers to pre-process and transform data from multiple sources before loading it into a specific data format, making it easier to work with lakehouse architecture. This customizable approach is crucial for compatibility with Apache Airflow's workflows.
Additionally, Quix Streams, an open-source Python library, supports the transformation of data using streaming DataFrames, allowing for operations such as aggregation, filtering, and merging during the transformation process. This facilitates efficient data handling and ensures that data is processed effectively from source to destination.

Secondly, Quix Streams, an open-source Python library, enables data transformation using streaming DataFrames, supporting essential operations like aggregation, filtering, and merging. This capability aligns well with Apache Airflow's focus on data processing and orchestration.
Furthermore, Quix offers the capability to sink transformed data to cloud storage in a specific format, promoting storage efficiency at the destination and enabling a cost-effective solution for managing data throughout the integration process. This lower total cost of ownership makes Quix an attractive option for organizations looking to streamline their data integration operations.

Moreover, Quix ensures efficient data handling from source to destination, with features like no throughput limits, automatic backpressure management, and checkpointing. These qualities enhance the reliability and scalability of data pipelines integrated with Apache Airflow.

Additionally, Quix supports sinking transformed data to cloud storage in a specific format, ensuring seamless integration and storage efficiency, which is crucial for a modern data architecture.

Furthermore, Quix offers a cost-effective solution for managing data throughout the entire pipeline, leading to a lower total cost of ownership compared to other alternatives. This cost efficiency is essential for organizations looking to optimize their data infrastructure.

Lastly, Quix encourages users to explore the platform, engage with the community, and access resources like GitHub and Slack. This emphasis on community involvement enhances users' understanding of data integration and fosters collaboration, which is beneficial for teams working with Apache Airflow.

Overall, Quix's capabilities make it a strong contender for integration with Apache Airflow, providing data engineers with the tools they need to efficiently manage data processing and transform workflows.
Overall, the features provided by Quix, such as efficient data handling, customizable connectors, and cost-effectiveness, make it a great fit for integrating with Apache Airflow and enhancing data integration processes from source to destination.
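The streaming-DataFrame operations mentioned on this page can be sketched with the open-source quixstreams library. This is a hedged sketch, not part of the generated page: the broker address, topic names, and the transformation itself are placeholders, and the 2.x API is assumed:

```python
from quixstreams import Application

# Minimal sketch: read from one Kafka topic, filter and reshape rows,
# and write the result to another topic.
app = Application(broker_address="localhost:9092", consumer_group="docs-example")
input_topic = app.topic("raw-events", value_deserializer="json")
output_topic = app.topic("clean-events", value_serializer="json")

sdf = app.dataframe(input_topic)
sdf = sdf.filter(lambda row: row.get("value") is not None)          # drop empty rows
sdf = sdf.apply(lambda row: {**row, "value": float(row["value"])})  # normalize the field
sdf = sdf.to_topic(output_topic)

app.run(sdf)
```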

20 changes: 6 additions & 14 deletions docs/connect/kafka-to-apache-ambari.md
@@ -12,13 +12,13 @@
</div>
</div>

Quix helps you integrate Apache Kafka with Apache Airflow using pure Python.
Quix helps you integrate Apache Kafka with Apache Ambari using pure Python.

Transform and pre-process data, with the new alternative to Confluent Kafka Connect, before loading it into a specific format, simplifying data lake house arthitectures, reducing storage and ownership costs and enabling data teams to achieve success for your business.
Transform and pre-process data, with the new alternative to Confluent Kafka Connect, before loading it into a specific format, simplifying data lake house architecture, reducing storage and ownership costs and enabling data teams to achieve success for your business.

## Apache Ambari

UNRECOGNIZED TECH ALERT
Apache Ambari is an open-source technology designed to make managing, monitoring, and provisioning Hadoop clusters easier. With Apache Ambari, users can easily deploy, manage, and monitor Hadoop clusters through a user-friendly web interface. This technology provides a centralized platform for administrators to streamline the management of their big data infrastructure, allowing them to quickly and efficiently configure and monitor their clusters without the need for extensive manual intervention. Additionally, Apache Ambari offers advanced monitoring capabilities, enabling users to track the performance and health of their clusters in real-time, making it an invaluable tool for organizations utilizing Hadoop for their big data needs.

## Integrations

@@ -31,17 +31,9 @@ UNRECOGNIZED TECH ALERT
</div>


As a seasoned tech writer with vast experience, I can confidently say that Quix is indeed a great fit for integrating with Apache Ambari.
Quix is a highly suitable platform for integrating with Apache Ambari due to its unique capabilities. Quix allows data engineers to pre-process and transform data from various sources before loading it into a specific data format, which simplifies the lakehouse architecture. This is achieved through customizable connectors for different destinations, enabling users to integrate their data in a way that best suits their needs. Additionally, Quix Streams, an open-source Python library, facilitates the transformation of data using streaming DataFrames. This allows for operations such as aggregation, filtering, and merging during the transformation process, providing flexibility and efficiency.

Quix offers data engineers the flexibility to pre-process and transform data from various sources before loading it into a specific data format. This aligns perfectly with Apache Ambari's capabilities, as it simplifies the lakehouse architecture with customizable connectors for different destinations.
Furthermore, Quix ensures efficient handling of data from source to destination with features such as no throughput limits, automatic backpressure management, and checkpointing. This helps in optimizing data processing and ensuring smooth data flow throughout the integration process. Additionally, Quix supports sinking transformed data to cloud storage in a specific format, ensuring seamless integration and storage efficiency at the destination. This feature is particularly beneficial for organizations looking to leverage cloud storage for their data needs.

Furthermore, Quix Streams, an open-source Python library, enables the transformation of data using streaming DataFrames, allowing for operations like aggregation, filtering, and merging during the transformation process. This feature complements Apache Ambari's data processing capabilities seamlessly.

Additionally, Quix ensures efficient data handling from source to destination with features like no throughput limits, automatic backpressure management, and checkpointing. This efficiency is crucial for integrating smoothly with Apache Ambari.

Moreover, Quix supports sinking transformed data to cloud storage in a specific format, ensuring seamless integration and storage efficiency at the destination. This capability is valuable for users who rely on Apache Ambari for cloud data management.

Overall, Quix offers a cost-effective solution for managing data from source through transformation to destination, which can help reduce the total cost of ownership for users compared to other alternatives.

In conclusion, with Quix's comprehensive features and seamless integration capabilities, it is a perfect match for integrating with Apache Ambari, providing users with a powerful tool for data management and transformation.
Overall, Quix offers a cost-effective solution for managing data from source through transformation to destination, making it a compelling choice for integrating with Apache Ambari. Its unique capabilities and efficiencies make it a valuable tool for data engineers and organizations looking to streamline their data integration processes.
