While using the cx_Oracle Python library, the following error is returned:

Error message: Cannot locate a 64-bit Oracle Client library: "libclntsh.so: cannot open shared object file: No such file or directory"

The cx_Oracle library depends on native libraries such as libclntsh.so, which ship with the Oracle Instant Client, so the Instant Client must be installed to use cx_Oracle.

This often breaks because the Instant Client is not installed, or because its directory is missing from the LD_LIBRARY_PATH variable.

To fix this issue, we can use the init script below to download the Instant Client and add it to the LD_LIBRARY_PATH variable. Sometimes the dynamic linker's run-time cache also needs to be refreshed, especially when native libraries have cross dependencies and were placed on the machine before LD_LIBRARY_PATH was set. In that case, run ldconfig -v /path/to/native/libs before attempting to load the native library.
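Before touching init scripts, it can help to confirm whether libclntsh.so is visible to the loader at all. The sketch below (a hypothetical diagnostic, not part of cx_Oracle) scans LD_LIBRARY_PATH and falls back to the system loader's own lookup via the standard library:

```python
import os
import ctypes.util

def find_oracle_client():
    """Return the first directory on LD_LIBRARY_PATH that contains
    libclntsh.so, or fall back to the loader's own search (ld.so cache).
    Returns None when the library cannot be located either way."""
    for d in os.environ.get("LD_LIBRARY_PATH", "").split(os.pathsep):
        if d and os.path.exists(os.path.join(d, "libclntsh.so")):
            return d
    # ctypes.util.find_library consults the same mechanisms the dynamic
    # linker uses, so None here usually means DPI-1047 at connect time.
    return ctypes.util.find_library("clntsh")

print(find_oracle_client())
```

If this prints None on the driver node, cx_Oracle will fail with the DPI-1047 error above regardless of how it is invoked.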

Here is a sample init script that should help.

Step 1: Create the base directory where you want to store the init script (assuming it does not exist). Here we use dbfs:/databricks/<directory> as an example.

dbutils.fs.mkdirs("dbfs:/databricks/<directory>/")

Step 2: Create the script.
dbutils.fs.put("dbfs:/databricks/oracleTest/oracle_ctl.sh","""
#!/bin/bash
# Download the Instant Client archive; update the URL for a different version
wget --quiet -O /tmp/instantclient-basiclite-linux.x64-19.3.0.0.0dbru.zip https://download.oracle.com/otn_software/linux/instantclient/193000/instantclient-basiclite-linux.x6...
unzip /tmp/instantclient-basiclite-linux.x64-19.3.0.0.0dbru.zip -d /databricks/driver/oracle_ctl/
sudo echo 'export LD_LIBRARY_PATH="/databricks/driver/oracle_ctl/"' >> /databricks/spark/conf/spark-env.sh
sudo echo 'export ORACLE_HOME="/databricks/driver/oracle_ctl/"' >> /databricks/spark/conf/spark-env.sh
""", True)

Step 3: Verify that the script exists.

display(dbutils.fs.ls("dbfs:/databricks/<directory>/oracle_ctl.sh"))

Step 4: Configure a cluster-scoped init script on the cluster.

  • On the cluster configuration page, click the Advanced Options toggle.
  • At the bottom of the page, click the Init Scripts tab.
  • In the Destination drop-down, select DBFS, provide the file path to the script, and click Add.

Step 5: Restart the cluster.
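After the restart, a quick smoke test from a notebook cell can tell the three failure modes apart: cx_Oracle missing, the client library still unloadable, or everything in place. The helper below is a hypothetical sketch; cx_Oracle.clientversion() forces the client library to load, so a lingering DPI-1047 surfaces here rather than at first connect:

```python
def check_oracle_client():
    """Return a human-readable status string for the Oracle client setup."""
    try:
        import cx_Oracle
    except ImportError:
        return "cx_Oracle is not installed"
    try:
        # clientversion() loads libclntsh.so, so DPI-1047 is raised here
        # if LD_LIBRARY_PATH still does not reach the Instant Client.
        return "client version: %s" % (cx_Oracle.clientversion(),)
    except cx_Oracle.DatabaseError as e:
        return "client load failed: %s" % (e,)

print(check_oracle_client())
```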

This did not work for me. I get the following error:

DPI-1047: Cannot locate a 64-bit Oracle Client library: "/databricks/driver/oracle_ctl//lib/libclntsh.so: cannot open shared object file: No such file or directory".

When I tried to check for the directory using dbutils.fs.ls("/databricks/driver/oracle_ctl") I am not able to find it. Maybe the init script is not copying the client as expected. So I also manually downloaded the Oracle client and mapped it to my cluster by creating the location "/databricks/driver/oracle_ctl/", still with no success.

I also noticed that the error points to a location "..../oracle_ctl//lib/libclntsh". When I inspected the downloaded client, I could not find any folder called /lib/libclntsh. Maybe it is pointing to the wrong directory because of recent changes?
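One plausible reading of that doubled slash (a sketch of the path arithmetic, not the driver's actual source): the client loader appends "lib/libclntsh.so" to ORACLE_HOME by plain concatenation, so an ORACLE_HOME that already ends in "/" and points at the unzip root produces exactly the path seen in the error, one level deeper than where the .so actually sits:

```python
def searched_path(oracle_home):
    """Naive concatenation mirroring the doubled slash in the DPI-1047
    message: ORACLE_HOME + "/lib/libclntsh.so"."""
    return oracle_home + "/lib/libclntsh.so"

# Reproduces the path reported in the error above.
print(searched_path("/databricks/driver/oracle_ctl/"))
# → /databricks/driver/oracle_ctl//lib/libclntsh.so
```

Since the basiclite Instant Client unzips its .so files at the top level of the versioned folder (no lib/ subdirectory), pointing LD_LIBRARY_PATH directly at that folder, as the answer below the question does, sidesteps the problem.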

Any help is appreciated in connecting to an Oracle database in an on-premises system.

Hi @Manoj Ashvin, can you try the init script below?

dbutils.fs.put("dbfs:/databricks/oracleTest/oracle_ctl_new.sh","""
#!/bin/bash
# libaio1 is a runtime dependency of the Instant Client; -y keeps the
# install non-interactive so the init script does not hang
sudo apt-get -y install libaio1
wget --quiet -O /tmp/instantclient-basiclite-linuxx64.zip https://download.oracle.com/otn_software/linux/instantclient/instantclient-basiclite-linuxx64.zip
unzip /tmp/instantclient-basiclite-linuxx64.zip -d /databricks/driver/oracle_ctl/
# Rename the versioned directory so the exported paths stay stable
mv /databricks/driver/oracle_ctl/instantclient* /databricks/driver/oracle_ctl/instantclient
sudo echo 'export LD_LIBRARY_PATH="/databricks/driver/oracle_ctl/instantclient/"' >> /databricks/spark/conf/spark-env.sh
sudo echo 'export ORACLE_HOME="/databricks/driver/oracle_ctl/instantclient/"' >> /databricks/spark/conf/spark-env.sh
""", True)
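After the cluster restarts, you can sanity-check from a notebook that the unzip and rename landed where spark-env.sh now points. This helper is hypothetical; the path constant matches the script above:

```python
import os

# Path that the init script above exports in LD_LIBRARY_PATH / ORACLE_HOME.
EXPECTED = "/databricks/driver/oracle_ctl/instantclient"

def client_status(path=EXPECTED):
    """Report whether the Instant Client directory and its shared
    libraries are present at the expected location."""
    if not os.path.isdir(path):
        return "missing: " + path
    libs = [f for f in os.listdir(path) if f.startswith("libclntsh.so")]
    if libs:
        return "found libs: %s" % (sorted(libs),)
    return "directory exists but contains no libclntsh.so"

print(client_status())
```

"missing: ..." here would mean the init script did not run (or failed before unzip), which is worth checking in the cluster's init script logs before debugging cx_Oracle itself.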
