Pakistan's First Oracle Blog

Blog by Fahd Mirza Chughtai

AI Web Scraping for Free with DeepSeek R1 Locally with Crawl4AI and Ollama

Thu, 2025-02-06 22:19

This video shows how to do AI web scraping easily and locally with DeepSeek R1, Ollama, and Crawl4AI.




Code:

https://github.com/fahdmirza/deepseekwebscraper

Categories: DBA Blogs

Getting Data LLM-Ready with JSON in Oracle Database

Thu, 2025-02-06 16:25

If you are building an AI-powered application, especially one that uses tools and function calling, you know that using JSON can greatly improve the accuracy of your application and help the LLM give more grounded responses. That is where this blog post comes in: it shows how easy and powerful it is to use JSON with Oracle PL/SQL, which is still very much relevant in today's AI world.

Oracle Database provides native support for JavaScript Object Notation (JSON) data, allowing you to store, index, and query JSON data using standard SQL and PL/SQL.

Benefits of Using JSON in Oracle Database

  • Schemaless development: Quickly react to changing application requirements without needing to change storage schemas.
  • Flexible data analysis and reporting: Leverage the power of SQL and relational databases for complex data analysis and reporting.
  • Rock-solid data protection and access control: Ensure data integrity and security with Oracle Database's robust features.

JSON data can be stored, indexed, and queried without defining a schema. Oracle Database supports JSON natively, providing features like transactions, indexing, declarative querying, and views. JSON data is stored using standard SQL data types such as VARCHAR2, CLOB, and BLOB. It is recommended to use an is_json check constraint to ensure column values are valid JSON instances.
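
For example, here is a minimal sketch of a table using such a check constraint (the table and column names are made up for illustration):

-- Hypothetical table: JSON documents stored in a VARCHAR2 column,
-- with an is_json check constraint so only valid JSON is accepted
create table book_orders (
  order_doc varchar2(4000)
            constraint order_doc_is_json check (order_doc is json)
);

insert into book_orders (order_doc)
values ('{"book":"Book 1","qty":2}');

-- Inserting a value that is not valid JSON fails with ORA-02290 (check constraint violated)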

PL/SQL supports SQL code, including SQL code that accesses JSON data. You can use SQL/JSON functions and conditions as built-in PL/SQL functions.
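
As a small sketch of this (assuming Oracle Database 12.2 or later), json_value and the json_exists condition can be called directly inside a PL/SQL block:

declare
  v_doc  varchar2(200) := '{"name":"Ada","age":36}';
  v_name varchar2(50);
begin
  -- SQL/JSON function used as a built-in PL/SQL function
  v_name := json_value(v_doc, '$.name');

  -- SQL/JSON condition json_exists can be used as a BOOLEAN in PL/SQL
  if json_exists(v_doc, '$.age') then
    dbms_output.put_line(v_name || ' has an age attribute.');
  end if;
end;
/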

Additionally, PL/SQL provides object types for JSON, allowing for fine-grained construction and manipulation of in-memory JSON data.

Let's say we have a JSON object that represents a list of books:

declare
  v_json      clob;
  v_parsed    json_object_t;
  v_books     json_array_t;
  v_book      json_object_t;
  v_title     varchar2(100);
  v_author    varchar2(100);
  v_price     number;

begin
  -- Load JSON Data
  v_json := '{
    "books": [
      {
        "title": "Book 1",
        "author": "Author 1",
        "price": 10.99
      },
      {
        "title": "Book 2",
        "author": "Author 2",
        "price": 9.99
      },
      {
        "title": "Book 3",
        "author": "Author 3",
        "price": 12.99
      }
    ]
  }';

  -- Parse JSON
  v_parsed := json_object_t.parse(v_json);
  v_books := v_parsed.get_array('books');

  -- Loop through books
  -- JSON_ARRAY_T positions are 0-based; get() returns a JSON_ELEMENT_T
  for i in 0 .. v_books.get_size - 1
  loop
    v_book := treat(v_books.get(i) as json_object_t);
    v_title := v_book.get_string('title');
    v_author := v_book.get_string('author');
    v_price := v_book.get_number('price');

    -- Output book details
    dbms_output.put_line(v_title || ' by ' || v_author || ', Price: ' || v_price);
  end loop;
end;
/


The output of this script would be:

Book 1 by Author 1, Price: 10.99
Book 2 by Author 2, Price: 9.99
Book 3 by Author 3, Price: 12.99
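
The same document can also be queried with plain SQL. Here is a sketch using json_table, assuming the books JSON above has been stored in a hypothetical book_docs table with an is_json-constrained doc column:

-- book_docs is a hypothetical table holding the books JSON in its doc column
select jt.title, jt.author, jt.price
from   book_docs d,
       json_table(d.doc, '$.books[*]'
         columns (
           title  varchar2(100) path '$.title',
           author varchar2(100) path '$.author',
           price  number        path '$.price'
         )) jt;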

By leveraging Oracle Database's native support for JSON data, you can efficiently store, query, and analyze JSON data using standard SQL and PL/SQL.

Categories: DBA Blogs

How-To Speedup Initial Load from Oracle to SQL Server with Oracle GoldenGate 23ai

Sun, 2025-02-02 01:03

 Have you ever stared at your computer screen, watching the hours tick by as you wait for a massive data transfer to complete? If you're using Oracle GoldenGate to move data from an Oracle database to SQL Server, you might know the feeling all too well—especially when you're dealing with something like 200 million rows that take way too long. Let’s dive into how Oracle GoldenGate works and some simple tricks to speed up that initial load time.

What is Oracle GoldenGate?

First things first—what is Oracle GoldenGate? In simple terms, it's like a super-efficient courier service that gets your data from one place to another. It specializes in real-time data integration and replication, making sure every bit of information moves swiftly and accurately from your Oracle database to whatever destination you've chosen, like SQL Server, in this case. It's especially handy for businesses that need their data synchronized quickly and continuously.

Making the Initial Load Faster

Now, onto the good part: how can you speed up the initial load that seems to take forever? The good news is that there are several strategies you can use to make the process more efficient.

Using the BATCHSQL parameter is one approach. This allows you to bundle multiple SQL insert statements together, reducing the overall time spent on these operations. Creating a unique index on your target SQL Server for inserts is another useful tip. This helps your database manage the incoming data more efficiently, cutting down on the time it takes to sort and place the records.
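
As a rough sketch (the table and column names here are hypothetical), the unique index on the SQL Server target could be created ahead of the initial load like this, while BATCHSQL itself is simply added to the Replicat parameter file:

-- Hypothetical target table on SQL Server: a unique index on the key column
-- helps the target locate and apply the bulk of initial-load inserts more efficiently
CREATE UNIQUE INDEX ix_customers_customer_id
    ON dbo.customers (customer_id);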

Splitting your data into smaller batches is also a great way to speed things up. Instead of overwhelming your system with 200 million rows all at once, use the Range function or Column Filters to divide the data into more manageable chunks. This approach is especially useful for large tables, as it allows you to tackle the data in sections rather than trying to handle it all at once.

Finally, if you're working with multiple database instances that are all on the same version, consider creating multiple extracts that can connect to these different instances. This allows you to distribute the workload across various sources, which can significantly speed up the entire process.

By implementing these strategies, you can make your initial load process faster, smoother, and less stressful. Whether you're dealing with 200 million rows or even more, these tips can help you get the job done more efficiently.

Categories: DBA Blogs

How-To Fix Poco::IOException Error During Goldengate 23ai Upgrade

Fri, 2025-01-31 00:51

I recently upgraded Oracle GoldenGate 23ai. In this post, I'll share my experience, provide a step-by-step guide on how to upgrade Oracle GoldenGate 23ai using the GUI, and cover a weird error I received during the upgrade, which is as follows:

Error:

Operating system character set identified as UTF-8.
terminate called after throwing an instance of 'Poco::IOException'
what(): I/O error

If you want the TL;DR: I had to apply patch 27788241 to get this resolved. For the details, keep reading.


To start the upgrade, I downloaded the latest Oracle GoldenGate 23ai software from the Oracle Technology Network or eDelivery. Then, I moved the software to a staging folder and unzipped it.


For Linux users, the commands are:

$ mv /home/user/fbo_ggs_Linux_x64_Oracle_services_shiphome.zip /tmp
$ cd /tmp
$ unzip fbo_ggs_Linux_x64_Oracle_services_shiphome.zip


Next, I uploaded the Oracle GoldenGate 23ai software to a staging location on the server where the previous release of Oracle GoldenGate existed.


Upgrading the Service Manager


After installing the latest Oracle GoldenGate 23ai version, I upgraded the Service Manager. I logged into the Service Manager using the URL: https://hostname:servicemanager_port.


From the Service Manager Overview page, I selected the ServiceManager link in the Deployments section. Then, I clicked the pencil icon next to the Deployment Detail section to open the dialog box for editing the GoldenGate home path.


I updated the GoldenGate Home path with the full path to the new Oracle GoldenGate 23ai home and clicked Apply. Finally, I restarted the Service Manager using the Action dropdown.


Upgrading the Deployment


To upgrade the deployment, I stopped all Extract and Replicat processes. I checked for open transactions and Bounded Recovery.


Then, I updated the deployment with the location of the new Oracle GoldenGate 23ai Home directory. I edited the deployment details and updated the Oracle GoldenGate 23ai Home path.


Resolving the Error


During the upgrade, I got this error:


Operating system character set identified as UTF-8.
terminate called after throwing an instance of 'Poco::IOException'
what(): I/O error


After researching, I found that applying patch 27788241 fixed the issue. I applied the patch, and the upgrade completed successfully.


Hope this helps.

Categories: DBA Blogs

DeepSeek R1 Coding Examples - Easy Tutorial for Beginners and Experts

Sun, 2025-01-26 19:16

This video gives a hands-on tutorial on using the DeepSeek R1 model in code for various use cases.



Code:

https://github.com/fahdmirza/deepseekr1 (examples of using the DeepSeek R1 Reasoner model in Python code with the API)

Categories: DBA Blogs

Oracle Scheduler SQL Cheat Sheet

Tue, 2025-01-21 21:42

 Oracle Scheduler is a built-in job scheduler in Oracle Database that enables you to manage and execute various tasks, such as running database program units, external executables, and scripts. It provides a flexible and sophisticated way to schedule jobs based on time, events, or dependencies, allowing you to automate routine tasks and reduce manual intervention.

The Scheduler offers advanced features, including prioritization of jobs based on business requirements, resource allocation, and monitoring of job execution. It also supports execution of jobs in a clustered environment, such as Oracle Real Application Clusters (Oracle RAC). With Oracle Scheduler, you can streamline your database operations, improve reliability, and reduce operating costs.
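
As a quick illustration before the cheat sheet, here is a minimal sketch of creating a simple time-based job with DBMS_SCHEDULER (the job name, schema, and schedule are made up for the example):

begin
  dbms_scheduler.create_job(
    job_name        => 'NIGHTLY_STATS_JOB',   -- hypothetical job name
    job_type        => 'PLSQL_BLOCK',
    job_action      => 'begin dbms_stats.gather_schema_stats(''SCOTT''); end;',
    start_date      => systimestamp,
    repeat_interval => 'FREQ=DAILY; BYHOUR=2', -- run daily at 2 AM
    enabled         => true,
    comments        => 'Nightly stats gathering for the SCOTT schema');
end;
/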


You can use the following SQL queries to monitor and manage your Scheduler and its jobs:


Check Scheduler Job Details in CDB

-- View job details in the Container Database (CDB)
SELECT
  CON_ID,
  JOB_NAME,
  JOB_TYPE,
  ENABLED,
  STATE,
  NEXT_RUN_DATE,
  REPEAT_INTERVAL
FROM
  cdb_scheduler_jobs;

Monitor Currently Running Jobs

-- View currently running jobs
SELECT
  job_name,
  session_id,
  running_instance,
  elapsed_time
FROM
  dba_scheduler_running_jobs;

View Job Run Details

-- View details of job runs
SELECT
  *
FROM
  DBA_SCHEDULER_JOB_RUN_DETAILS;

View Job-Related Logs

-- View logs related to job execution
SELECT
  *
FROM
  DBA_SCHEDULER_JOB_LOG;

Check All Scheduler Schedules

-- Set formatting options for output
SET PAGESIZE 200
SET LINES 299
COL START_DATE FOR A45
COL REPEAT_INTERVAL FOR A45
COL schedule_name FOR A34

-- View all scheduler schedules
SELECT
  schedule_name,
  schedule_type,
  start_date,
  repeat_interval
FROM
  dba_scheduler_schedules;

History of All Scheduler Job Runs

-- Set formatting options for output
SET PAGESIZE 299
SET LINES 299
COL JOB_NAME FOR A24
COL actual_start_date FOR A56
COL RUN_DURATION FOR A34

-- View history of all scheduler job runs
SELECT
  job_name,
  status,
  actual_start_date,
  run_duration
FROM
  DBA_SCHEDULER_JOB_RUN_DETAILS
ORDER BY
  ACTUAL_START_DATE DESC;

Check Log Information for All Scheduler Jobs

-- Set formatting options for output
SET PAGESIZE 299
SET LINES 299
COL job_name FOR A24
COL log_date FOR A40
COL operation FOR A19
COL additional_info FOR A79

-- View log information for all scheduler jobs
SELECT
  job_name,
  log_date,
  status,
  OPERATION,
  ADDITIONAL_INFO
FROM
  dba_scheduler_job_log
ORDER BY
  log_date DESC;

Check All Scheduler Windows Details

-- Set formatting options for output
SET PAGESIZE 300
SET LINESIZE 200

-- View all scheduler windows details
SELECT
  *
FROM
  dba_scheduler_windows;

Categories: DBA Blogs

SOLVED - ORA-20000: ORA-24247: Network access denied by access control list (ACL) with Select AI

Wed, 2025-01-15 00:11

I have been experimenting with Oracle Select AI for a couple of months now and it has become one of my favorite tools. The other day, I encountered an error while playing with it, and it took me some time to find a fix, so I thought of sharing it with you.

If you don't know what Oracle Select AI is, here is a quick simple intro:


Oracle Select AI is a tool that lets you interact with your database using everyday language instead of complicated SQL code. It uses artificial intelligence (AI) to understand what you're asking and generate the right SQL queries for you. This makes it easier for everyone, regardless of their technical expertise, to get insights from their data and develop AI-based applications.


Now that we know what Oracle Select AI is, let's have a look at the error:


I was trying to integrate Select AI with Oracle APEX. Oracle Application Express (Oracle APEX) is a low-code application development platform that enables users to build scalable, secure enterprise applications. It is a web-based integrated development environment (IDE) that runs as part of the Oracle Database.


But when I tried to configure a profile with OpenAI's model like the following:


BEGIN
    DBMS_CLOUD_AI.CREATE_PROFILE(
        profile_name => 'OPENAI_ORACLE',
        attributes => '{ "provider": "openai",
                         "credential_name": "OPENAI_CRED",
                         "object_list": [{"owner": "SCOTT", "name": "MYTABLE"}]
                       }',
        description => 'AI profile to use OpenAI with Oracle APEX'
    );
END;
/


I was getting the following error:


ORA-20000: ORA-24247: Network access denied by access control list (ACL)
ORA-06512: at "C##CLOUD$SERVICE.DBMS_CLOUD$PDBCS_240705_0", line 2064
ORA-06512: at "C##CLOUD$SERVICE.DBMS_CLOUD_AI", line 5674
ORA-06512: at line 2 Error at Line: 7 Column: 0


After much ado, it turned out that I had missed the following step to grant network access:


BEGIN
    DBMS_NETWORK_ACL_ADMIN.APPEND_HOST_ACE(
          HOST => 'api.openai.com',
          ACE => XS$ACE_TYPE(PRIVILEGE_LIST => XS$NAME_LIST('http'),
                             PRINCIPAL_NAME => 'SCOTT',
                             PRINCIPAL_TYPE => XS_ACL.PTYPE_DB)
    );
END;
/


As soon as I ran the above network ACL command, it worked like a charm.


Hope that helps.

Categories: DBA Blogs

NVIDIA SANA Model Local Installation with GUI - Step-by-Step Tutorial

Mon, 2025-01-13 17:48

 This video locally installs NVIDIA SANA which is a text-to-image framework that can efficiently generate images up to 4096 × 4096 resolution.


Code:

git clone https://github.com/NVlabs/Sana.git && cd Sana

./environment_setup.sh sana

pip install huggingface_hub

huggingface-cli login   # get a read token from huggingface.co and also accept access to the google gemma model on huggingface

# official online demo
DEMO_PORT=15432 \
python3 app/app_sana.py \
    --share \
    --config=configs/sana_config/1024ms/Sana_1600M_img1024.yaml \
    --model_path=hf://Efficient-Large-Model/Sana_1600M_1024px/checkpoints/Sana_1600M_1024px.pth \
    --image_size=1024
   
Access demo at http://localhost:15432
Categories: DBA Blogs

How to Use OpenAI models with Oracle Select AI

Tue, 2025-01-07 23:54

If you want to use OpenAI models with Oracle databases, then Oracle Select AI makes it a breeze to do so. Provided you already have an Oracle Autonomous Database in OCI, you can use the following steps to query your database in natural language with LLMs from OpenAI.

Oracle Autonomous Database Select AI is a powerful tool that enables you to leverage AI capabilities directly within your database environment, and OpenAI is a leading AI model provider. Since OpenAI is a paid option, you will need to grab your OpenAI API key from platform.openai.com. By default, Oracle Select AI uses gpt-3.5-turbo from OpenAI, but you can select any model from the list below:


  • gpt-3.5-turbo (default)
  • gpt-4o
  • gpt-4o-mini
  • gpt-4
  • gpt-4-0613
  • gpt-4-32k
  • gpt-4-32k-0613
  • gpt-3.5-turbo-0613
  • gpt-3.5-turbo-16k
  • gpt-3.5-turbo-16k-0613


Let's first create a user:

CREATE USER SCOTT IDENTIFIED BY "SelectAI25#TEST";
GRANT RESOURCE TO SCOTT;
GRANT CREATE SESSION TO SCOTT;
GRANT CREATE VIEW TO SCOTT;
GRANT CREATE TABLE TO SCOTT;
GRANT CONNECT TO SCOTT;

-- Grants EXECUTE privilege to SCOTT
--
SQL> grant execute on DBMS_CLOUD_AI to SCOTT;


-- Grant Network ACL for OpenAI endpoint
--
SQL> BEGIN
       DBMS_NETWORK_ACL_ADMIN.APPEND_HOST_ACE(
         host => 'api.openai.com',
         ace  => xs$ace_type(privilege_list => xs$name_list('http'),
                             principal_name => 'SCOTT',
                             principal_type => xs_acl.ptype_db)
       );
     END;
     /

 

--
-- Create Credential for AI provider
--
BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    CREDENTIAL_NAME => 'OPENAI_CRED',
    username        => 'OPENAI',
    password        => '<OPENAI_API_KEY>');
END;
/

 

--
-- Create AI profile
--
BEGIN
  DBMS_CLOUD_AI.CREATE_PROFILE(
    profile_name => 'OPENAI',
    attributes   => '{"provider": "openai",
                      "credential_name": "OPENAI_CRED",
                      "object_list": [{"owner": "SCOTT", "name": "MYTABLE"}],
                      "conversation": "true"
                     }');
END;
/

 

--
-- Enable AI profile in current session
--
SQL> EXEC DBMS_CLOUD_AI.SET_PROFILE('OPENAI');

--
-- Use Select AI
--
SQL> select ai how many rows exist in table;

SQL> select ai how many users in Jakarta are jobless;

-- You can also show the actual SQL with the result:

SQL> select ai showsql how many users in Jakarta are jobless;

 

Categories: DBA Blogs

How-To Create Lip Sync Video with AI - Change Any Video or Audio Locally and Free

Mon, 2025-01-06 17:59

This video shows how to locally install Latent Sync AI model to lip sync any video and audio for free and in private.



Code:

git clone https://github.com/bytedance/LatentSync.git && cd LatentSync

source setup_env.sh

./inference.sh

Categories: DBA Blogs

Install NVIDIA Ingest Locally and Use it with Thousands of Documents

Sat, 2025-01-04 01:24

 This video shares step-by-step instructions to install NVIDIA Ingest locally and use it with PDFs, Word, and PowerPoint.


Code:



Pre-requisites:
===============

-- Install docker
-- Get NGC api key from https://ngc.nvidia.com/
-- Get Early Access from https://developer.nvidia.com/nemo-microservices-early-access/join

Phase 1= Configure NV-INGEST Server:
====================================

Step 1:

git clone https://github.com/nvidia/nv-ingest && cd nv-ingest

Step 2:

docker login nvcr.io

Username: $oauthtoken
Password: <Your NGC API Key>

Step 3:

Make sure NVIDIA is set as your default container runtime before running the docker compose command:
sudo nvidia-ctk runtime configure --runtime=docker --set-as-default

Step 4:

docker compose up


Phase 2= Configure NV-INGEST client:
====================================

Step 1:


conda create --name nv-ingest-dev --file ./conda/environments/nv_ingest_environment.yml
conda activate nv-ingest-dev

cd client
pip install .

Step 2:

nv-ingest-cli \
  --doc ./data/multimodal_test.pdf \
  --output_directory ./processed_docs \
  --task='extract:{"document_type": "pdf", "extract_method": "pdfium", "extract_tables": "true", "extract_images": "true"}' \
  --client_host=localhost \
  --client_port=7670

 
Where to find output?
======================

After the ingestion steps above have completed, you should be able to find text and image subfolders inside your processed docs folder. Each will contain JSON formatted extracted content and metadata.

  ls -R processed_docs
Categories: DBA Blogs

Oracle Select AI and DBMS_CLOUD_AI in Simple Words

Tue, 2024-12-31 23:36

Oracle Select AI enables you to elevate your productivity and develop innovative AI-based applications by interacting with your database and Large Language Models (LLMs) using natural language through SQL.

Select AI leverages generative AI to automate and simplify various tasks, including:

  • Generating, running, and explaining SQL queries from natural language prompts
  • Retrieval augmented generation using vector stores
  • Synthetic data generation
  • Conversing with LLMs
  • Seamless Natural Language Interaction

With Select AI, Autonomous Database effortlessly converts natural language into SQL. This enables you to interact with your data using natural language prompts instead of SQL code.

Select AI serves as a productivity tool for both expert and non-expert SQL users, enabling them to:

  • Derive valuable insights from data without requiring extensive technical knowledge
  • Automate the retrieval augmented generation process
  • Utilize features like synthetic data generation, chat history support, and more from a SQL interface

The DBMS_CLOUD_AI package facilitates integration with user-specified LLMs, enabling natural language to SQL generation. This package provides an augmented prompt to the LLM, containing relevant database schema metadata, to generate, run, and explain SQL queries based on natural language prompts.
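
For a flavor of how this looks in practice, here is a sketch of the main Select AI actions once a profile has been created and set in the session (the prompts are arbitrary examples):

-- runsql (the default action): generate the SQL and run it
select ai how many orders were placed last month;

-- showsql: return the generated SQL instead of running it
select ai showsql how many orders were placed last month;

-- explainsql: ask the LLM to explain the generated SQL
select ai explainsql how many orders were placed last month;

-- chat: send the prompt to the LLM without generating SQL against your schema
select ai chat what is retrieval augmented generation;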

At the moment, the following models are supported by Select AI (the entries marked "(default)" are the default model for their respective provider):

  • meta.llama-3.1-70b-instruct (default)
  • meta.llama-3.1-405b-instruct
  • meta.llama-3.2-90b-vision-instruct
  • cohere.command-r-16k (deprecated)
  • cohere.command-r-plus (deprecated)
  • cohere.command-r-08-2024
  • cohere.command-r-plus-08-2024
  • GPT-4o
  • GPT-4
  • GPT-4 Turbo with Vision
  • GPT-3.5-Turbo
  • command (default)
  • command-nightly (experimental)
  • command-r
  • command-r-plus
  • command-light
  • Mixtral-8x7B-Instruct-v0.1 (default)
  • Meta-Llama-3-70B-Instruct
  • Qwen1.5-1.8B
  • claude-3-5-sonnet-20240620 (default)
  • claude-3-opus-20240229
  • claude-3-sonnet-20240229
  • gpt-3.5-turbo (default)
  • gpt-4o
  • gpt-4o-mini
Categories: DBA Blogs

Install Kokoro TTS Model Locally

Mon, 2024-12-30 01:50

 This video locally installs Kokoro which is a frontier TTS model for its size of 82 million parameters. It can be run anywhere.





!git clone https://huggingface.co/hexgrad/Kokoro-82M
%cd Kokoro-82M

!apt-get -qq -y install espeak-ng > /dev/null 2>&1
!pip install -q phonemizer torch transformers scipy munch

from models import build_model
import torch
device = 'cuda' if torch.cuda.is_available() else 'cpu'
MODEL = build_model('kokoro-v0_19.pth', device)

VOICE_NAME = [
    'af',
    'af_bella', 'af_sarah', 'am_adam', 'am_michael',
    'bf_emma', 'bf_isabella', 'bm_george', 'bm_lewis',][0]

VOICEPACK = torch.load(f'voices/{VOICE_NAME}.pt', weights_only=True).to(device)
print(f'Loaded voice: {VOICE_NAME}')

from kokoro import generate
text = "How could I know? It's an unanswerable question. Like asking an unborn child if they'll lead a good life. They haven't even been born."
audio, out_ps = generate(MODEL, text, VOICEPACK, lang=VOICE_NAME[0])

from IPython.display import display, Audio
display(Audio(data=audio, rate=24000, autoplay=True))
print(out_ps)
Categories: DBA Blogs

MagicQuill Installation on Windows, Linux, Mac for AI Image Editing for Free

Fri, 2024-12-20 19:22

 This video is an easy step-by-step tutorial to install MagicQuill locally on Linux, Windows, Mac.


Code:

conda create -n ai python=3.10 -y && conda activate ai

git clone --recursive https://github.com/magic-quill/MagicQuill.git && cd MagicQuill

wget -O models.zip "https://hkustconnect-my.sharepoint.com/:u:/g/personal/zliucz_connect_ust_hk/EWlGF0WfawJIrJ1Hn85_-3gB0MtwImAnYeWXuleVQcukMg?e=Gcjugg&download=1"

unzip models.zip

pip install torch==2.1.2 torchvision==0.16.2 torchaudio==2.1.2 --index-url https://download.pytorch.org/whl/cu118

pip install gradio_magicquill-0.0.1-py3-none-any.whl

cp -f pyproject.toml MagicQuill/LLaVA/
pip install -e MagicQuill/LLaVA/

pip install -r requirements.txt

python gradio_run.py
Categories: DBA Blogs

SOLVED - Cannot Log in Oracle cloud with 2FA after Phone Change with Oracle Mobile Authenticator

Tue, 2024-12-17 17:27

I have been logging in to Oracle Cloud using two-factor authentication (2FA) with Oracle Mobile Authenticator, and it was going fine until I had to change my phone. Both of my phones are Android, and I THOUGHT that I would simply migrate the apps and keep using the accounts in my Oracle Mobile Authenticator the same way, but it seems that after the migration I lost all the accounts.

Multi-Factor Authentication (MFA) is a security process that requires a user to provide two or more authentication factors to access a system, network, or application. Two-Factor Authentication (2FA) is a type of Multi-Factor Authentication that requires a user to provide two authentication factors:

  • Something you know (password, PIN)
  • Something you have (smartphone, token, or a one-time password sent via SMS or authenticator app)

So I was using 2FA with the Oracle Mobile Authenticator. I tried my older codes, QR codes, the password, PIN and so on, but nothing worked. No matter what I tried, I simply couldn't log in to Oracle Cloud since the page asked me for a code generated by the authenticator.

Eventually, the following is the only way I could find to resolve this issue:

I talked to Oracle on live chat, and they arranged for an engineer to send me a bypass code.

If you don't know what the Oracle Mobile Authenticator app is, then as per the docs:

Oracle Mobile Authenticator enables you to securely verify your identity by using your mobile device as an authentication factor. The app generates one-time passwords for login. Or it can receive notifications for login, which can be approved with a simple tap. When this authentication is used on top of username and password, it adds an additional layer of security that is essential for today's online applications.

Features:

  • Generate one-time passwords even when the device is offline
  • Push Notification based approval
  • App PIN for app protection
  • Set up via QR code, Config URL, or by entering key manually
  • Multiple account support
  • Generate OTP for other applications that make use of One-Time Password as per RFC 6238


I hope this helps.

Categories: DBA Blogs

How-To Integrate ChatGPT with Oracle Digital Assistant

Tue, 2024-12-17 01:31

 Oracle Digital Assistant (ODA) provides a comprehensive platform for creating conversational interfaces. This article will guide you through integrating ChatGPT with ODA using the bots-node-sdk and openai libraries.

Prerequisites:

  • Oracle Digital Assistant instance
  • ChatGPT API key
  • Node.js environment

Configuration:

Create a new file named services.js and add the following code:

const OracleBot = require('@oracle/bots-node-sdk');
const { WebhookClient, WebhookEvent } = OracleBot.Middleware;
const express = require('express');
const { Configuration, OpenAIApi } = require("openai");

const configuration = new Configuration({
  apiKey: "YOUR_CHATGPT_API_KEY",
});

const openai = new OpenAIApi(configuration);

const textGeneration = async (prompt) => {
  try {
    const response = await openai.createCompletion({
      model: 'text-davinci-003',
      prompt: `Human: ${prompt}\nAI: `,
      temperature: 0.9,
      max_tokens: 500,
      top_p: 1,
      frequency_penalty: 0,
      presence_penalty: 0.6,
      stop: ['Human:', 'AI:'],
    });
    return {
      status: 1,
      response: `${response.data.choices[0].text}`,
    };
  } catch (error) {
    return {
      status: 0,
      response: '',
    };
  }
};

module.exports = (app) => {
  const logger = console;

  // Initialize Oracle Digital Assistant
  OracleBot.init(app, {
    logger,
  });

  // Set up webhook integration
  const webhook = new WebhookClient({
    channel: {
      url: "YOUR_ODA_WEBHOOK_URL",
      secret: "YOUR_ODA_WEBHOOK_SECRET",
    },
  });

  // Handle incoming messages
  webhook.on(WebhookEvent.MESSAGE_RECEIVED, (message) => {
    const action = message.queryResult.action;
    const queryText = message.queryResult.queryText;

    if (action === 'input.unknown') {
      textGeneration(queryText).then((result) => {
        if (result.status === 1) {
          res.send({
            fulfillmentMessages: [
              {
                text: {
                  text: [result.response],
                },
              },
            ],
          });
        } else {
          res.send({
            fulfillmentMessages: [
              {
                text: {
                  text: ["Sorry, I'm not able to help with that."],
                },
              },
            ],
          });
        }
      });
    } else {
      res.send({
        fulfillmentMessages: [
          {
            text: {
              text: [`No handler for action ${action}`],
            },
          },
        ],
      });
    }
  });

  // Set up endpoint for incoming messages
  app.post('/bot/message', (req, res) => {
    const message = req.body;
    webhook.send(message).then(() => res.send('ok'));
  });
};


  • Replace YOUR_CHATGPT_API_KEY with your actual ChatGPT API key.
  • Replace YOUR_ODA_WEBHOOK_URL and YOUR_ODA_WEBHOOK_SECRET with your actual Oracle Digital Assistant webhook URL and secret.

Dialog Flow Integration

To integrate the ChatGPT service with your Oracle Digital Assistant dialog flow, follow these steps:

Create a new intent in your dialog flow with the action input.unknown.

Add a new fulfillment to the intent with the following settings:

Fulfillment type: Webhook

Webhook URL: YOUR_APP_URL/bot/message (replace with your actual app URL)

HTTP method: POST

Save and deploy your dialog flow.

Testing

Test your integration by sending a message to your Oracle Digital Assistant instance. The message should be routed to the ChatGPT service, which will generate a response. The response will then be sent back to the user.

Note: Make sure to replace the placeholder values with your actual credentials and URLs.
Categories: DBA Blogs

Understanding Read-Only Options in Oracle: Instances vs. Databases

Mon, 2024-12-16 23:20

 When it comes to limiting data modifications in Oracle, two options are available: Read-Only Instances and Read-Only Databases. While both options restrict data changes, they serve different purposes and are used in distinct contexts.

Read-Only Instances:

A Read-Only Instance is a configuration in Oracle Real Application Clusters (RAC) where one or more instances are set to operate in read-only mode. This setup is ideal for environments with high concurrency for both read and write operations.

Key features of Read-Only Instances include:

  • Real-time query scaling by dedicating specific instances to read-only operations
  • Write operations are not allowed on designated read-only instances, but other instances can still handle writes
  • Useful for load balancing in RAC configurations
  • Read-Only Instances are suitable for offloading read-heavy workloads in RAC environments and supporting real-time analytics without impacting primary write performance.

Read-Only Databases

A Read-Only Database, on the other hand, is a database-wide mode that restricts all write operations. This setup is typically used for archiving, reporting, or maintenance tasks. 

Key features of Read-Only Databases include:

  • The entire database is locked for write operations
  • Used for archiving, reporting, or maintenance tasks
  • Can be achieved using the ALTER DATABASE OPEN READ ONLY command (see the sketch after this list) or a Data Guard physical standby database
  • Read-Only Databases are ideal for archiving purposes, maintenance periods, or using a standby database for reporting.
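
A minimal sketch of the ALTER DATABASE route from the list above (run as SYSDBA; it assumes you can briefly restart the database):

-- Restart the database into read-only mode
shutdown immediate;
startup mount;
alter database open read only;

-- Confirm the open mode
select open_mode from v$database;

-- Later, return to normal read-write operation
shutdown immediate;
startup;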

Choosing the Right Option:

When deciding between Read-Only Instances and Read-Only Databases, consider the following:

  • If you have a RAC environment and need to offload read-heavy workloads, Read-Only Instances might be the better choice.
  • If you need to restrict write operations across the entire database, a Read-Only Database is the way to go.

Ultimately, understanding the differences between Read-Only Instances and Read-Only Databases will help you make informed decisions about managing your Oracle database.

Hope this helps. 

Categories: DBA Blogs

Resolving the "Invalid Characters" Error in Oracle Database 23ai Free Edition Installation

Thu, 2024-12-12 23:58

Oracle Database 23ai Free Edition offers a fully functional database for development, testing, and production purposes, allowing users to experience the powerful features of Oracle Database. However, users may encounter errors during the installation process, which can be frustrating and time-consuming to resolve. This article addresses a common issue that users may encounter during the installation of Oracle Database 23ai Free Edition and provides a solution to ensure a successful installation.

During a silent installation of Oracle Database 23ai Free Edition on Windows, the process terminates abruptly, and the setup.log file displays the following error message:

"SEVERE: The provided destination folder has invalid characters. Verify and try again."

The log file continues to grow in size, and the installation process must be manually terminated. This error can occur even when the destination folder path appears to be correct and free of any invalid characters.

Troubleshooting:

To resolve this issue, ensure that the following conditions are met:


1. Absolute Path for RSP File

Specify an absolute path for the RSP file in the command line. For example:

setup.exe /s /v"RSP_FILE=c:\myinstallpath\FREEInstall.rsp" /v"/L*v setup.log" /v"/qn"

This is necessary because the setup.exe file does not recognize the RSP file if only the filename is provided. By specifying the absolute path, you ensure that the setup.exe file can locate the RSP file correctly.


2. Empty Values in RSP File

Although the RSP file comment suggests that no parameter should be left with an empty value, it is safe to leave the DB_DOMAIN parameter empty if it is not required. This is because the DB_DOMAIN parameter is not mandatory, and leaving it empty does not affect the installation process.

Here is an example RSP file (FREEInstall.rsp) that can be used for a successful installation:


#Do not leave any parameter with empty value

#Install Directory location, username can be replaced with current user

INSTALLDIR=C:\app\myname\product\23ai\

#Database password, All users are set with this password, Remove the value once installation is complete

PASSWORD=mypassword

#If listener port is set to 0, available port will be allocated starting from 1521 automatically

LISTENER_PORT=0

#Specify char set of the database

CHAR_SET=AL32UTF8

#Specify the database domain for the db unique name specification

DB_DOMAIN=

#Specify TRUE for performing software only install

SOFTWARE_ONLY=FALSE

#Specify TRUE if installer should modify directory permissions when ACL is incorrect

MODIFY_DIRECTORY_PERMISSIONS=TRUE


By following the troubleshooting steps and using the example RSP file provided, you should be able to successfully install Oracle Database 23ai Free Edition on your Windows system. Remember to specify the absolute path for the RSP file and leave the DB_DOMAIN parameter empty if it is not required. If you encounter any further issues, refer to the Oracle Database documentation and support resources for assistance.

Categories: DBA Blogs

Control LLM's Output with Ollama Structured Outputs

Sun, 2024-12-08 00:14

 This video shows how to use Ollama to constrain the LLM output to a structured format locally.




Code:

pip install -U ollama

from ollama import chat
from pydantic import BaseModel

class Country(BaseModel):
  name: str
  capital: str
  languages: list[str]

response = chat(
  messages=[
    {
      'role': 'user',
      'content': 'Tell me about It.',
    }
  ],
  model='llama3.2',
  format=Country.model_json_schema(),
)

country = Country.model_validate_json(response.message.content)
print(country)

============

from ollama import chat
from pydantic import BaseModel

class Pet(BaseModel):
  name: str
  animal: str
  age: int
  color: str | None
  favorite_toy: str | None

class PetList(BaseModel):
  pets: list[Pet]

response = chat(
  messages=[
    {
      'role': 'user',
      'content': '''
        I have two pets.
        A cat named Luna who is 5 years old and loves playing with yarn. She has grey fur.
        I also have a 2 year old black cat named Loki who loves tennis balls.
      ''',
    }
  ],
  model='llama3.1',
  format=PetList.model_json_schema(),
)

pets = PetList.model_validate_json(response.message.content)
print(pets)

=============

from ollama import chat
from pydantic import BaseModel
from typing import List, Literal, Optional

class Object(BaseModel):
  name: str
  confidence: float
  attributes: str

class ImageDescription(BaseModel):
  summary: str
  objects: List[Object]
  scene: str
  colors: List[str]
  time_of_day: Literal['Morning', 'Afternoon', 'Evening', 'Night']
  setting: Literal['Indoor', 'Outdoor', 'Unknown']
  text_content: Optional[str] = None

path = 'path/to/image.jpg'

response = chat(
  model='llama3.2-vision',
  format=ImageDescription.model_json_schema(),  # Pass in the schema for the response
  messages=[
    {
      'role': 'user',
      'content': 'Analyze this image and describe what you see, including any objects, the scene, colors and any text you can detect.',
      'images': [path],
    },
  ],
  options={'temperature': 0},  # Set temperature to 0 for more deterministic output
)

image_description = ImageDescription.model_validate_json(response.message.content)
print(image_description)
Categories: DBA Blogs

Install Indic Parler-TTS model Locally

Tue, 2024-12-03 22:49

 This video shows how to locally install Indic Parler-TTS which can officially speak in 20 Indic languages.





Code:

conda create -n ai python=3.11 -y && conda activate ai

sudo apt-get install libportaudio2
conda install -c anaconda pyaudio

pip install torch torchaudio einops timm pillow
pip install git+https://github.com/huggingface/transformers
pip install git+https://github.com/huggingface/accelerate
pip install git+https://github.com/huggingface/diffusers
pip install huggingface_hub
pip install sentencepiece bitsandbytes protobuf decord
pip install librosa peft numpy

pip install git+https://github.com/huggingface/parler-tts.git


conda install -c conda-forge --override-channels notebook -y
conda install -c conda-forge --override-channels ipywidgets -y
jupyter notebook

import torch
from parler_tts import ParlerTTSForConditionalGeneration
from transformers import AutoTokenizer
import soundfile as sf

device = "cuda:0" if torch.cuda.is_available() else "cpu"

model = ParlerTTSForConditionalGeneration.from_pretrained("ai4bharat/indic-parler-tts").to(device)
tokenizer = AutoTokenizer.from_pretrained("ai4bharat/indic-parler-tts")
description_tokenizer = AutoTokenizer.from_pretrained(model.config.text_encoder._name_or_path)

prompt = "अरे, तुम आज कैसे हो?"
description = "A female speaker delivers a slightly expressive and animated speech with a moderate speed and pitch. The recording is of very high quality, with the speaker's voice sounding clear and very close up."

input_ids = description_tokenizer(description, return_tensors="pt").input_ids.to(device)
prompt_input_ids = tokenizer(prompt, return_tensors="pt").input_ids.to(device)

generation = model.generate(input_ids=input_ids, prompt_input_ids=prompt_input_ids)
audio_arr = generation.cpu().numpy().squeeze()
sf.write("indic_tts_out.wav", audio_arr, model.config.sampling_rate)
Categories: DBA Blogs
