This repository has been archived by the owner on Jan 29, 2024. It is now read-only.

Fix: content indentation and more (#2283)
ArthurFlag authored Nov 23, 2023
1 parent 274c6c6 commit 51a5e21
Showing 40 changed files with 266 additions and 264 deletions.
4 changes: 0 additions & 4 deletions conf.py
@@ -312,10 +312,6 @@
:width: 24px
:class: no-scaled-link
.. |tick| image:: /images/icon-tick.png
:width: 24px
:class: no-scaled-link
.. |beta| replace:: :bdg-secondary:`beta`
.. |preview| replace:: :bdg-secondary:`preview`
29 changes: 10 additions & 19 deletions docs/integrations/rsyslog.rst
@@ -28,16 +28,16 @@ Client <https://github.com/aiven/aiven-client>`__ .
.. code::
avn service integration-endpoint-create --project your-project \
    -d example-syslog -t rsyslog \
    -c server=logs.example.com -c port=514 \
    -c format=rfc5424 -c tls=true
-d example-syslog -t rsyslog \
-c server=logs.example.com -c port=514 \
-c format=rfc5424 -c tls=true
When defining the remote syslog server the following parameters can be
applied using the ``-c`` switch.

Required:

- ``server`` -  DNS name or IPv4 address of the server
- ``server`` - DNS name or IPv4 address of the server

- ``port`` - port to connect to
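
As an illustration, a minimal endpoint definition could pass only these required parameters. This is a sketch that assumes the remaining options fall back to their defaults; ``example-syslog`` and ``logs.example.com`` are placeholder values:

.. code::

   avn service integration-endpoint-create --project your-project \
       -d example-syslog -t rsyslog \
       -c server=logs.example.com -c port=514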

@@ -100,17 +100,17 @@ endpoint previously created
.. code::
avn service integration-endpoint-list --project your-project
ENDPOINT_ID                           ENDPOINT_NAME   ENDPOINT_TYPE
====================================  ==============  =============
618fb764-5832-4636-ba26-0d9857222cfd  example-syslog  rsyslog
ENDPOINT_ID ENDPOINT_NAME ENDPOINT_TYPE
==================================== ============== =============
618fb764-5832-4636-ba26-0d9857222cfd example-syslog rsyslog
Then you can link the service to the endpoint

.. code::
avn service integration-create --project your-project \
    -t rsyslog -s your-service \
    -D 618fb764-5832-4636-ba26-0d9857222cfd
-t rsyslog -s your-service \
-D 618fb764-5832-4636-ba26-0d9857222cfd
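
To confirm that the link was created, you can list the integrations attached to the service (a hedged sketch; ``your-service`` is a placeholder name):

.. code::

   avn service integration-list --project your-project your-service
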
Example configurations
----------------------
@@ -119,15 +119,6 @@ Rsyslog is a standard integration so you can use it with any external system. We

.. note:: All integrations can be configured using the Aiven Console or the Aiven CLI though the examples are easier to copy and paste in the CLI form.

* :ref:`Coralogix<rsyslog_coralogix>`
* :doc:`Datadog </docs/integrations/datadog/datadog-logs>`
* :ref:`Loggly<rsyslog_loggly>`
* :doc:`Logtail </docs/integrations/rsyslog/logtail>`
* :ref:`Mezmo<rsyslog_mezmo>`
* :ref:`New Relic<rsyslog_new_relic>`
* :ref:`Papertrail<rsyslog_papertrail>`
* :ref:`Sumo Logic<rsyslog_sumo_logic>`

.. _rsyslog_coralogix:

Coralogix
@@ -216,7 +207,7 @@ Papertrail
~~~~~~~~~~

As `Papertrail <https://www.papertrail.com/>`_ identifies the client based on
the server and port  you only need to copy the appropriate values from the
the server and port you only need to copy the appropriate values from the
"Log Destinations" page and use those as the values for ``server`` and ``port``
respectively. You **do not need** the ca-bundle as the Papertrail servers use
certificates signed by a known CA. You also need to set the format to
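
To make this concrete, a sketch of a Papertrail endpoint definition follows; the host and port are hypothetical placeholders standing in for the values shown on your "Log Destinations" page:

.. code::

   avn service integration-endpoint-create --project your-project \
       -d papertrail-logs -t rsyslog \
       -c server=logsN.papertrailapp.com -c port=12345 \
       -c format=rfc5424 -c tls=true
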
2 changes: 1 addition & 1 deletion docs/platform/concepts/choosing-timeseries-database.rst
@@ -17,4 +17,4 @@ time series databases in its product portfolio.
**high availability.**

Find our more about our time series offerings on `our website
<https://aiven.io/time-series-databases/what-are-time-series-databases>`__ .
<https://aiven.io/time-series-databases/what-are-time-series-databases>`__.
6 changes: 3 additions & 3 deletions docs/platform/howto/create-service-integration.rst
@@ -37,10 +37,10 @@ Create an integration

This dashboard is a predefined view that is automatically maintained by Aiven.

.. note::
.. note::

It may take a minute to start getting data into to the dashboard view if you just enabled the integrations. The view can be refreshed by reloading in the top-right corner. You can add custom dashboards by either defining them from scratch in Grafana or by saving a copy of the predefined dashboard under a different name that does not start with *Aiven*.
It may take a minute to start getting data into to the dashboard view if you just enabled the integrations. The view can be refreshed by reloading in the top-right corner. You can add custom dashboards by either defining them from scratch in Grafana or by saving a copy of the predefined dashboard under a different name that does not start with *Aiven*.

.. warning::

Any changes that you make to the predefined dashboard are eventually automatically overwritten by the system.
Any changes that you make to the predefined dashboard are eventually automatically overwritten by the system.
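
For reference, the same kind of integration can also be set up from the Aiven CLI. The sketch below is an assumption-laden example: the ``metrics`` and ``dashboard`` integration types and the service names are placeholders to adapt to your project:

.. code::

   # Send service metrics to a metrics database (placeholder service names)
   avn service integration-create --project your-project \
       -t metrics -s your-kafka -d your-metrics-db

   # Point a Grafana service at that metrics database to see the predefined dashboard
   avn service integration-create --project your-project \
       -t dashboard -s your-grafana -d your-metrics-db
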
32 changes: 16 additions & 16 deletions docs/platform/reference/project-member-privileges.rst
@@ -44,28 +44,28 @@ You can grant different levels of access to project members using roles:
- Power services on/off
- Edit members and roles
* - Administrator
- |tick|
- |tick|
- |tick|
- |tick|
- |tick|
- |tick|
-
-
-
-
-
-
* - Operator
- |tick|
- |tick|
- |tick|
- |tick|
- |tick|
-
-
-
-
-
-
* - Developer
- |tick|
- |tick|
- |tick|
- |tick|
-
-
-
-
-
-
* - Read Only
- |tick|
-
-
-
-
5 changes: 3 additions & 2 deletions docs/products/flink/howto/connect-kafka.rst
@@ -17,8 +17,9 @@ To create a Apache Flink® table based on an Aiven for Apache Kafka® topic via
1. In the Aiven for Apache Flink service page, select **Application** from the left sidebar.
2. Create a new application or select an existing one with Aiven for Apache Kafka integration.

.. note::
If editing an existing application, create a new version to make changes to the source or sink tables.
.. note::

If editing an existing application, create a new version to make changes to the source or sink tables.

3. In the **Create new version** screen, select **Add source tables**.
4. Select **Add new table** or select **Edit** if you want to edit an existing source table.
3 changes: 2 additions & 1 deletion docs/products/flink/howto/connect-opensearch.rst
@@ -21,7 +21,8 @@ To create an Apache Flink table based on an Aiven for OpenSearch® index via Aiv
2. Create a new application or select an existing one with Aiven for OpenSearch® integration.

.. note::
If editing an existing application, create a new version to make changes to the source or sink tables.

If editing an existing application, create a new version to make changes to the source or sink tables.

3. In the **Create new version** screen, select **Add sink tables**.

18 changes: 9 additions & 9 deletions docs/products/flink/howto/create-integration.rst
@@ -14,21 +14,21 @@ You can easily create Aiven for Apache Flink® data service integrations via the
1. Navigate to the Aiven for Apache Flink® service page.
2. If you are setting up the first integration for the selected Aiven for Apache Flink service, select **Get Started** in the service **Overview** screen.

.. image:: /images/products/flink/integrations-get-started.png
:scale: 80 %
:alt: Image of the Aiven for Apache Flink Overview page with focus on the Get Started Icon
.. image:: /images/products/flink/integrations-get-started.png
:scale: 80 %
:alt: Image of the Aiven for Apache Flink Overview page with focus on the Get Started Icon

3. To configure the data flow with Apache Flink®, select the Aiven for Apache Kafka®, Aiven for PostgreSQL®, or Aiven for OpenSearch® service that you wish to integrate. Click the **Integrate** button to complete the integration process.

.. image:: /images/products/flink/integrations-select-services.png
:scale: 50 %
:alt: Image of the Aiven for Apache Flink Integration page showing an Aiven for Apache Kafka® and an Aiven for PostgreSQL® services
.. image:: /images/products/flink/integrations-select-services.png
:scale: 50 %
:alt: Image of the Aiven for Apache Flink Integration page showing an Aiven for Apache Kafka® and an Aiven for PostgreSQL® services

4. You can include additional integrations using the plus(**+**) button in the **Data Flow** section

.. image:: /images/products/flink/integrations-add.png
:scale: 60 %
:alt: Image of the Aiven for Apache Flink Integration page showing an existing Aiven for Apache Kafka integration and the + icon to add additional integrations
.. image:: /images/products/flink/integrations-add.png
:scale: 60 %
:alt: Image of the Aiven for Apache Flink Integration page showing an existing Aiven for Apache Kafka integration and the + icon to add additional integrations
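
If you prefer to script this step, a rough CLI equivalent is sketched below; the ``flink`` integration type and the service names are assumptions to adapt to your own project:

.. code::

   avn service integration-create --project your-project \
       -t flink -s your-kafka -d your-flink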



78 changes: 39 additions & 39 deletions docs/products/flink/howto/flink-confluent-avro.rst
@@ -29,46 +29,46 @@ Create an Apache Flink® table with Confluent Avro
5. In the **Add new source table** or **Edit source table** screen, select the Aiven for Apache Kafka® service as the integrated service.
6. In the **Table SQL** section, enter the SQL statement below to create an Apache Kafka®-based Apache Flink® table with Confluent Avro:

.. code:: sql
CREATE TABLE kafka (
-- specify the table columns
) WITH (
'connector' = 'kafka',
'properties.bootstrap.servers' = '',
'scan.startup.mode' = 'earliest-offset',
'topic' = 'my_test.public.students',
'value.format' = 'avro-confluent', -- the value data format is Confluent Avro
'value.avro-confluent.url' = 'http://localhost:8082', -- the URL of the schema registry
'value.avro-confluent.basic-auth.credentials-source' = 'USER_INFO', -- the source of the user credentials for accessing the schema registry
'value.avro-confluent.basic-auth.user-info' = 'user_info' -- the user credentials for accessing the schema registry
)
The following are the parameters:

* ``connector``: the **Kafka connector type**, between the **Apache Kafka SQL Connector** (value ``kafka``) for standard topic reads/writes and the **Upsert Kafka SQL Connector** (value ``upsert-kafka``) for changelog type of integration based on message key.

.. note::
For more information on the connector types and the requirements for each, see the articles on :doc:`Kafka connector types </docs/products/flink/concepts/kafka-connectors>` and :doc:`the requirements for each connector type </docs/products/flink/concepts/kafka-connector-requirements>`.

* ``properties.bootstrap.servers``: this parameter can be left empty since the connection details will be retrieved from the Aiven for Apache Kafka integration definition

* ``topic``: the topic to be used as a source for the data pipeline. If you want to use a new topic that does not yet exist, write the topic name.
* ``value.format``: indicates that the value data format is in the Confluent Avro format.

.. note::
The ``key.format`` parameter can also be set to the ``avro-confluent`` format.

* ``avro-confluent.url``: this is the URL for the Karapace schema registry.
* ``value.avro-confluent.basic-auth.credentials-source``: this specifies the source of the user credentials for accessing the Karapace schema registry. At present, only the ``USER_INFO`` value is supported for this parameter.
* ``value.avro-confluent.basic-auth.user-info``: this should be set to the ``user_info`` string you created earlier.
.. code:: sql
.. important::
To access the Karapace schema registry, the user needs to provide the username and password using the ``user_info`` parameter. The ``user_info`` parameter is a string formatted as ``user_info = f"{username}:{password}"``.

Additionally, on the source table, the user only needs read permission to the subject containing the schema. However, on the sink table, if the schema does not exist, the user must have write permission for the schema registry.

It is important to provide this information to authenticate and access the Karapace schema registry.
CREATE TABLE kafka (
-- specify the table columns
) WITH (
'connector' = 'kafka',
'properties.bootstrap.servers' = '',
'scan.startup.mode' = 'earliest-offset',
'topic' = 'my_test.public.students',
'value.format' = 'avro-confluent', -- the value data format is Confluent Avro
'value.avro-confluent.url' = 'http://localhost:8082', -- the URL of the schema registry
'value.avro-confluent.basic-auth.credentials-source' = 'USER_INFO', -- the source of the user credentials for accessing the schema registry
'value.avro-confluent.basic-auth.user-info' = 'user_info' -- the user credentials for accessing the schema registry
)
The following are the parameters:

* ``connector``: the **Kafka connector type**, between the **Apache Kafka SQL Connector** (value ``kafka``) for standard topic reads/writes and the **Upsert Kafka SQL Connector** (value ``upsert-kafka``) for changelog type of integration based on message key.

.. note::
For more information on the connector types and the requirements for each, see the articles on :doc:`Kafka connector types </docs/products/flink/concepts/kafka-connectors>` and :doc:`the requirements for each connector type </docs/products/flink/concepts/kafka-connector-requirements>`.

* ``properties.bootstrap.servers``: this parameter can be left empty since the connection details will be retrieved from the Aiven for Apache Kafka integration definition

* ``topic``: the topic to be used as a source for the data pipeline. If you want to use a new topic that does not yet exist, write the topic name.
* ``value.format``: indicates that the value data format is in the Confluent Avro format.

.. note::
The ``key.format`` parameter can also be set to the ``avro-confluent`` format.

* ``avro-confluent.url``: this is the URL for the Karapace schema registry.
* ``value.avro-confluent.basic-auth.credentials-source``: this specifies the source of the user credentials for accessing the Karapace schema registry. At present, only the ``USER_INFO`` value is supported for this parameter.
* ``value.avro-confluent.basic-auth.user-info``: this should be set to the ``user_info`` string you created earlier.

.. important::
To access the Karapace schema registry, the user needs to provide the username and password using the ``user_info`` parameter. The ``user_info`` parameter is a string formatted as ``user_info = f"{username}:{password}"``.

Additionally, on the source table, the user only needs read permission to the subject containing the schema. However, on the sink table, if the schema does not exist, the user must have write permission for the schema registry.

It is important to provide this information to authenticate and access the Karapace schema registry.
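
As a loose illustration of where these values come from, the sketch below assumes the Karapace schema registry is enabled on the Aiven for Apache Kafka service and that an existing service user's credentials are reused; the option names are assumptions and may differ between aiven-client versions:

.. code::

   # Enable the Karapace schema registry on the Kafka service (assumed option name)
   avn service update your-kafka -c schema_registry=true

   # Inspect the service to find the schema registry URI, username, and password
   avn service get your-kafka --json

   # user_info is then the string "<username>:<password>", for example avnadmin:<password>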

7. To create a sink table, select **Add sink tables** and repeat steps 4-6 for sink tables.
8. In the **Create statement** section, create a statement that defines the fields retrieved from each message in a topic.
18 changes: 10 additions & 8 deletions docs/products/flink/howto/manage-flink-tables.rst
@@ -22,8 +22,9 @@ Follow these steps add a new table to an application using the `Aiven Console <h
2. Select **Create new version**.
3. On the **Create new version** screen, navigate to the **Add source tables** or **Add sink tables** screen within your application.
4. Select **Add new table** to add a new table to your application.
.. note::
If you already have a sink table listed, you must delete it before adding a new one, only one sink table is allowed per job.

.. note::
If you already have a sink table listed, you must delete it before adding a new one, only one sink table is allowed per job.

5. Select the **Integrated service** from the drop-down list in the **Add new source table** or **Add new sink table** screen, respectively.
6. In the **Table SQL** section, enter the statement that will create the table. The interactive query feature if the editor will prompt you for error or invalid queries.
@@ -34,8 +35,9 @@ Import an existing table
Follow these steps import an existing table from another application:

1. In the **Add source tables** or **Add sink tables** screen, select **Import existing table** to import a table to your application.
.. note::
If you already have a sink table listed, you must delete it before importing a new one.

.. note::
If you already have a sink table listed, you must delete it before importing a new one.

2. From the **Import existing source table** or **Import existing sink table** screen:

@@ -53,14 +55,14 @@ Follow these steps to clone a table within an application:

1. In the **Add source tables** screen, locate the table you want to clone and click **Clone** next to it.

.. note::
Clone option is not available sink tables.
.. note::
Clone option is not available sink tables.

2. Select the **Integrated service** from the drop-down list.
3. In the **Table SQL** section, update the table name.

.. note::
You will not be able to add the table if there are errors within the statement.
.. note::
You will not be able to add the table if there are errors within the statement.

4. Select **Add table** to complete the process.
