Updated OML Notebook in Chat with Your Data in Autonomous Database Using Generative AI workshop (oracle-livelabs#368)

* Update introduction.md

* Update setup-workshop-environment.md

* Data Studio Workshop Changes

* changes to data studio workshop

* Update setup-workshop-environment.md

* adb changes

* Update recipient-diagram.png

* diagram change

* Update user-bucket-credential-diagram.png

* SME feedback

* Update create-share.md

* Nilay changes

* changes

* Update consume-share.md

* Anoosha's feedback

* Update consume-share.md

* updated 2 screens and a sentence

* minor changes

* deleted extra images and added doc references

* new ECPU changes

* more changes to data sharing workshops

* more changes to fork (data studio)

* more changes

* Marty's feedback

* Marty's feedback to plsql workshop too

* Update setup-workshop-environment.md

* Delete 7381.png

* workshop # 3 ADB set up

and a couple of minor typos in workshops 1 and 2

* changes to adb-dcat workshop

* more changes

* minor typos in all 4 workshops

* quarterly qa build data lake

* new lab 11 in build DL with ADW

* minor changes database actions drop-down list

* final changes to build data lake workshop

* AI updates

AI workshop updates

* ai workshop updates

* Update query-using-select-ai.md

* Update query-using-select-ai.md

* updates

* more updates

* Update query-using-select-ai.md

* more new updates to ai workshop

* Update query-using-select-ai.md

* a new screen capture

* push Marty's feedback to fork

Final changes.

* updates sandbox manifest

* updates

* restored sandbox manifest

* Update setup-environment.md

* updates after CloudWorld

* final updates to ai workshop (also new labs 4 and 5)

* marty's feedback

* incorporated feedback

* minor PR edits by Sarah

* removed steps 7 & 8 Lab 2 > Task 3 per Alexey

The customer asked to remove this as it's not a requirement for the bucket to be public.

* more changes

* more changes per Alexey

* Update load-os-data-public.md

* Quarterly QA

I added a new step per the PM's request in the Data Sharing PL/SQL workshop. I also made a minor edit (removed space) in the Data Lake workshop.

* more updates

* Quarterly QA changes

* Update consume-share.md

* minor edit based on workshop user

* quarterly qa November 2023

* Added new videos to the workshop

Replaced 3 old silent videos with new ones. Added two new videos.

* Adding important notes to the two data sharing workshops

Per the PM's request.

* folder structure-only push to production

This push and the later PR are to make sure the folder structure is in the production repo before I start development. Only 1 .md file and the workshops folder.

* typos

* cloud links workshop

* UPDATES

* Update query-view.png

* update

* minor updates to chat ai workshop (Fork)

* test clones

* test pr

* Alexey's feedback

* Update data-sharing-diagram.png

* sarah's edits

* changes to Data Load UI

* removed script causing ML issue

* Update load-local-data.md

* updates: deprecated procedure and new code

* updates and test

* more updates

* minor update

* testing using a building block in a workshop

* updates

* building blocks debugging

* Update manifest.json

* fixing issues

* Update manifest.json

* delete cleanup.md from workshop folder (use common file)

* use common cleanup.md instead of local cleanup.md

* test common tasks

* update data sharing data studio workshop

* Update create-recipient.png

* PM's 1 feedback

* quarterly qa

* missing "Lab 2" from Manifest

* always free note addition

added a note

* always free change

* Update setup-environment.md

* update manage and monitor workshop

* Folder structure for new data share workshop (plus introduction.md)

* Updated Load and Analyze from clone

* Data Lake minor changes from clone

* manage and monitor workshop

* Remove the lab from the workshop per Marty's request

* mark-hornick-feedback

* used marty's setup file

* replaced notebook with a new one

* updates to lab 6 of manage and monitor

* Update adb-auto-scaling.md

* Nilay's feedback

* Update adb-auto-scaling.md

* updates to second ai workshop

* note change

* Changes to Load and Analyze workshop (other minor changes too)

* quarterly qa

* Update diagrams per Alexey (remove delta share icon)

* updated the 15-minutes workshop

* Update analyzing-movie-sales-data.md

* ords updates and misc

* updated data studio workshop

* ORDS and Misc updates

* updated freetier version

* updated livelabs version

* updating the manage and monitor workshop

* more updates

* lab 11 updates

* updated lab 14

* updated freetier

* more updates

* Update adw-connection-wallet.md

* update

* Create purge-scn.png

* livelabs updates

* Update adb-flashback.md

* final updates

* updated screens Ramona's review

* Update click-add-peer-database-second-time.png

* update the adb-dcat workshop

1. New ORDS 24.1.0 launchpad.
2. New navigation path to create dynamic groups
3. Updated OML UI

* Update see-clone-information-in-details-page-2.png

* Requested changes to the Data Lake workshop

* more updates

* updates to Data Lake workshop

* kscope24 workshop for Alexey

* new lab & other updates

* Update load-os-data-private.md

* Update load-os-data-private.md

* more updates, new lab

* minor update example 2

* Update load-os-data-private.md

* Chat AI workshop changes

* new notebook

* more updates

* minor updates

* New Iceberg Lab

added to the freetier and livelabs workshops

* Update manifest.json

* Update query-iceberg-tables.md

* new livelabs folder to request green button

* folder structure for two more data sharing workshops - green button

* 3rd workshop

* NEW LIVE DATA SHARE WORKSHOP

* Anne's feedback

* update

* Update introduction.md

* changes to manifest files and setup lab

* updated AI Chat workshop

* feedback

* adding the missing lab 4 to the workshop

* Update introduction.md

* Update introduction.md

* notebook update

* Update query-using-select-ai.md

* updated notebook

---------

Co-authored-by: Michelle Malcher <48925485+malcherm@users.noreply.github.com>
Co-authored-by: Sarah Hirschfeld <107423288+shirschf@users.noreply.github.com>
3 people authored Oct 29, 2024
1 parent bcf4e6d commit bdaf531
Showing 10 changed files with 1,921 additions and 337 deletions.
Binary file modified shared/adb-speaks-human/integrate-genai/images/create-llama.png
110 changes: 46 additions & 64 deletions shared/adb-speaks-human/integrate-genai/integrate-genai.md
@@ -80,71 +80,49 @@ You can create as many profiles as you need, which is useful when comparing the

For a complete list of the Select AI profile attributes, see the [DBMS\_CLOUD\_AI\_Package](https://docs.oracle.com/en/cloud/paas/autonomous-database/serverless/adbsb/dbms-cloud-ai-package.html#GUID-D51B04DE-233B-48A2-BBFA-3AAB18D8C35C) in the Using Oracle Autonomous Database Serverless documentation.
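
As a quick sketch of comparing profiles (the second profile name, `ociai_llama`, is only a hypothetical example here — substitute the profiles you have actually created), you can pass each profile name to `DBMS_CLOUD_AI.GENERATE` and compare the responses, or make one profile the session default with `DBMS_CLOUD_AI.SET_PROFILE`:

```
-- Sketch: compare two Select AI profiles on the same prompt.
-- 'genai' is created in this lab; 'ociai_llama' is a hypothetical second profile.
SELECT DBMS_CLOUD_AI.GENERATE(
         prompt       => 'what is oracle autonomous database',
         profile_name => 'genai',
         action       => 'chat') AS genai_response
FROM dual;

SELECT DBMS_CLOUD_AI.GENERATE(
         prompt       => 'what is oracle autonomous database',
         profile_name => 'ociai_llama',
         action       => 'chat') AS llama_response
FROM dual;

-- Optionally, make one profile the default for the current session.
BEGIN
    DBMS_CLOUD_AI.SET_PROFILE(profile_name => 'genai');
END;
/
```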

>**Note:** The deployment script created a Select AI profile using the code below:
>**Note:** The deployment script that you ran in **Lab 1 > Task 1 > Step 1** created a `Select AI` profile named **genai** using the code shown in this task; however, you'll practice dropping the profile and then recreating it. For the **region** parameter, specify the region name where the OCI GenAI service is running and to which your tenancy is subscribed. In our example, we used **us-chicago-1** (the default). If you are subscribed to **Frankfurt**, use `eu-frankfurt-1`; if you are subscribed to **London**, use `uk-london-1`.
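
If you only need to point an existing profile at a different region, a minimal sketch (assuming the **genai** profile already exists and that `DBMS_CLOUD_AI.SET_ATTRIBUTE` is available in your database version) is to update the **region** attribute in place rather than dropping and recreating the profile:

```
-- Sketch: switch an existing profile to the Frankfurt region.
BEGIN
    DBMS_CLOUD_AI.SET_ATTRIBUTE(
        profile_name    => 'genai',
        attribute_name  => 'region',
        attribute_value => 'eu-frankfurt-1');
END;
/
```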
1. Sign into the SQL worksheet as the **`MOVIESTREAM`** user with the password **`watchS0meMovies#`**. On the **Database Actions Launchpad** page, click the **Development** tab, and then click the **SQL** tab. The SQL Worksheet is displayed.

>**Note:** The **`MOVIESTREAM`** user, and the tables in that schema, were created as part of the setup. You can find the Moviestream password by navigating to **Developer Services** from the Navigation menu. Next, click **Resource Manager** > **Stacks** > Select the stack we created, **Deploy-ChatDB-Autonomous-Database...** > Select the job we created, **ormjob2024117214431** (use your own stack number) > Select **Outputs** under **Resources**.
![Moviestream password](./images/moviestream-output-pswd.png "")

2. Create an AI profile for the **Meta Llama 3** model. Copy and paste the following code into your SQL Worksheet, and then click the **Run Script** icon.

```
begin
<copy>
BEGIN
-- drops the profile if it already exists
DBMS_CLOUD_AI.drop_profile(
profile_name => 'genai',
force => true
);
-- Create an AI profile that uses the default LLAMA model on OCI
dbms_cloud_ai.create_profile(
-- Meta Llama 3 (this is the default model, so you could skip the model attribute if you like)
DBMS_CLOUD_AI.create_profile (
profile_name => 'genai',
attributes =>
'{"provider": "oci",
attributes =>
'{"provider": "oci",
"credential_name": "OCI$RESOURCE_PRINCIPAL",
"comments":"true",
"object_list": [
{"owner": "MOVIESTREAM", "name": "GENRE"},
{"owner": "MOVIESTREAM", "name": "CUSTOMER"},
{"owner": "MOVIESTREAM", "name": "PIZZA_SHOP"},
{"owner": "MOVIESTREAM", "name": "STREAMS"},
{"owner": "MOVIESTREAM", "name": "MOVIES"},
{"owner": "MOVIESTREAM", "name": "ACTORS"}
]
}'
);
end;
/
{"owner": "moviestream", "name": "GENRE"},
{"owner": "moviestream", "name": "CUSTOMER"},
{"owner": "moviestream", "name": "PIZZA_SHOP"},
{"owner": "moviestream", "name": "STREAMS"},
{"owner": "moviestream", "name": "MOVIES"},
{"owner": "moviestream", "name": "ACTORS"}
],
"region": "us-chicago-1"
}');
END;
/
</copy>
```

1. Sign into the SQL worksheet as the **`MOVIESTREAM`** user with the password **`watchS0meMovies#`**. On the **Database Actions Launchpad** page, click the **Development** tab, and then click the **SQL** tab. The SQL Worksheet is displayed.

>**Note:** The **`MOVIESTREAM`** user, and the tables in that schema, were created as part of the setup. You can find the Moviestream password by navigating to **Developer Services** from the Navigation menu. Next, click **Resource Manager** > **Stacks** > Select the stack we created, **Deploy-ChatDB-Autonomous-Database...** > Select the job we created, **ormjob2024117214431** (use your own stack number) > Select **Outputs** under **Resources**.
![Create AI profile](./images/create-llama.png "")

![Moviestream password](./images/moviestream-output-pswd.png "")

2. Create an AI profile for the **Meta Llama 3 model**. Copy and paste the following code into your SQL Worksheet, and then click the **Run Script** icon.

```
<copy>
BEGIN
-- drops the profile if it already exists
DBMS_CLOUD_AI.drop_profile(
profile_name => 'ociai_llama',
force => true
);
-- Meta Llama 3 (this is the default model, so you could skip the model attribute if you like)
DBMS_CLOUD_AI.create_profile (
profile_name => 'ociai_llama',
attributes =>
'{"provider": "oci",
"credential_name": "OCI$RESOURCE_PRINCIPAL",
"object_list": [
{"owner": "moviestream", "name": "GENRE"},
{"owner": "moviestream", "name": "CUSTOMER"},
{"owner": "moviestream", "name": "PIZZA_SHOP"},
{"owner": "moviestream", "name": "STREAMS"},
{"owner": "moviestream", "name": "MOVIES"},
{"owner": "moviestream", "name": "ACTORS"}
],
"model": "meta.llama-3-70b-instruct"
}');
END;
/
</copy>
```
![Create AI profile](./images/create-llama.png "")
<!--
3. Create an AI profile for the **Cohere model**. This model will not be used for SQL generation - it will only be used for generating innovative content. Copy and paste the following code into your SQL Worksheet, and then click the **Run Script** icon.
@@ -167,25 +145,27 @@
```
![Create AI profile](./images/create-cohere.png "")
-->

## Task 3: Test the AI profile

We will use the PL/SQL API to generate a response from the Cohere model. This example is using the **chat** action. It is not using any private data coming from your database.
We will use the PL/SQL API to generate a response from the **Meta Llama 3** model. This example uses the **chat** action; it does not use any private data from your database.

1. Test the LLM and learn about Autonomous Database as the MOVIESTREAM user using the **Cohere model**. Copy and paste the following code into your SQL Worksheet, and then click the **Run Script** icon.
1. Test the LLM and learn about Autonomous Database as the MOVIESTREAM user using the **genai** profile. Copy and paste the following code into your SQL Worksheet, and then click the **Run Script** icon.

```
<copy>
SELECT DBMS_CLOUD_AI.GENERATE(
prompt => 'what is oracle autonomous database',
profile_name => 'OCIAI_COHERE',
profile_name => 'genai',
action => 'chat')
FROM dual;
</copy>
```
![Test the LLM](./images/cohere-output.png "")
![Test the LLM](./images/genai-output.png "")
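
As a related sketch (assuming your session has a profile set via `DBMS_CLOUD_AI.SET_PROFILE`), the same **chat** action can also be expressed with the natural-language `SELECT AI` syntax instead of calling the PL/SQL API directly:

```
-- Sketch: set the profile for the current session, then chat with SELECT AI.
BEGIN
    DBMS_CLOUD_AI.SET_PROFILE(profile_name => 'genai');
END;
/

SELECT AI chat what is oracle autonomous database;
```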

2. Compare the Cohere model to the **Llama model**. Copy and paste the following code into your SQL Worksheet, and then click the **Run Script** icon.
<!--
2. Compare the **genai** model to the **Llama** model. Copy and paste the following code into your SQL Worksheet, and then click the **Run Script** icon.
```
<copy>
@@ -197,6 +177,7 @@ We will use the PL/SQL API to generate a response from the Cohere model. This ex
</copy>
```
![Generate sentence-like response](./images/llama-chat.png "")
-->

## Summary
You learned how to integrate Autonomous Database with OCI Generative AI, and you chatted with different models hosted on OCI Generative AI. Next, let's see how to use our private data with LLMs.
Expand All @@ -211,16 +192,17 @@ You may now proceed to the next lab.

## Acknowledgements

* **Author:** Lauran K. Serhal, Consulting User Assistance Developer
* **Contributors:**
* **Authors:**
* Lauran K. Serhal, Consulting User Assistance Developer
* Marty Gubar, Product Management
* **Contributors:**
* Stephen Stuart, Cloud Engineer
* Nicholas Cusato, Cloud Engineer
* Olivia Maxwell, Cloud Engineer
* Taylor Rees, Cloud Engineer
* Joanna Espinosa, Cloud Engineer

* **Last Updated By/Date:** Lauran K. Serhal, July 2024
* **Last Updated By/Date:** Lauran K. Serhal, October 2024

Data about movies in this workshop were sourced from **Wikipedia**.

8 changes: 4 additions & 4 deletions shared/adb-speaks-human/introduction/introduction.md
@@ -4,8 +4,7 @@

In this workshop, you will learn how to use **Autonomous Database Select AI (Select AI)** to query your data using natural language; you don't need prior knowledge of the data structure or how that data is accessed. Next, you'll use those capabilities in a voice-enabled APEX app that enables you to get answers to your questions from your desktop or mobile device.

> **NOTE:** Your tenancy must be subscribed to the **US Midwest (Chicago)** region in order to run this workshop. See the [OCI documentation](https://docs.oracle.com/en-us/iaas/Content/Identity/Tasks/managingregions.htm) for more details.
> **NOTE:** As of the last update to this workshop, your tenancy must be subscribed to the **US Midwest (Chicago)**, **Germany Central (Frankfurt)**, or **UK South (London)** region in order to run this workshop. See the [OCI documentation](https://docs.oracle.com/en-us/iaas/Content/Identity/Tasks/managingregions.htm) for more details.
### What is Natural Language Processing?

@@ -36,8 +35,9 @@ You may now proceed to the next lab.
* [Additional Autonomous Database Tutorials](https://docs.oracle.com/en/cloud/paas/autonomous-data-warehouse-cloud/tutorials.html)

## Acknowledgements
* **Author:** Lauran K. Serhal, Consulting User Assistance Developer
* **Contributor:** Marty Gubar, Product Manager
* **Authors:**
* Lauran K. Serhal, Consulting User Assistance Developer
* Marty Gubar, Product Manager
* **Last Updated By/Date:** Lauran K. Serhal, October 2024

Data about movies in this workshop were sourced from **Wikipedia**.