Upgrade to Release 6.0
Before you upgrade, review the Release Model, which describes the release statuses: Preview and Generally Available. Use it to identify the release version and status most suitable for upgrading your Sandbox, Developer, User Acceptance Testing (UAT), or Production environment.
All hosts must have adequate disk space for shared storage to support the migration to a new shared storage directory structure. In addition, if your Incorta cluster has a separate Apache Zookeeper cluster, you must upgrade Apache Zookeeper as well.
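As a quick sanity check before upgrading, standard shell utilities can report free space and current usage on each host; the path below is a hypothetical placeholder for your shared storage mount:

```bash
# Free space on the filesystem backing shared storage (hypothetical path)
df -h /path/to/shared/storage

# Current size of the shared storage directory, to estimate headroom
# for the new directory structure
du -sh /path/to/shared/storage
```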
If you are using Oracle as your metadata database, you must migrate to MySQL before upgrading. For more information, refer to Metadata Database Migration. Starting with the 6.0.1 release, however, you can continue using your Oracle metadata database.
If you use external Notebooks, you must force reinstall the Python library after upgrading to 6.0.
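If the library is installed with `pip`, a force reinstall is a minimal sketch along these lines; `<package-name>` is a placeholder for the specific Python library, which this guide does not name:

```bash
# Reinstall the library even if the installed version already matches,
# bypassing the local cache so a fresh copy is downloaded.
# <package-name> is a placeholder; substitute the actual library.
pip install --force-reinstall --no-cache-dir <package-name>
```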
With the introduction of the new-generation loader, Incorta automatically detects inter-object dependencies within a load plan during the Planning phase of a load job. The Loader Service uses these detected dependencies, along with the user-defined load order within the schema, to create an execution plan for loading objects. However, combining the MVs' user-defined load order with automatically detected dependencies may produce an execution plan with cyclic dependencies, causing load jobs to fail. To avoid such failures, delete the MVs' user-defined load order before upgrading to the 6.0 release.
The Spark workers and MVs might fail with `java.lang.UnsatisfiedLinkError`. To resolve this issue, do the following:

- Create a new `tmp` directory with `exec` permission (see the sketch after this list), or use the `<InstallationPath>/IncortaNode/spark/tmp` directory.
- Add the following configurations to the `<InstallationPath>/IncortaNode/spark/conf/spark-env.sh` file:

```bash
SPARK_WORKER_OPTS="-Djava.io.tmpdir=/dir/has/exec/permission"
SPARK_LOCAL_DIRS=/dir/has/exec/permission
```

- In the CMC > Server Configurations > Spark Integration > Extra options for Materialized views and notebooks, add the following options:

```
spark.driver.extraJavaOptions=-Djava.io.tmpdir=/dir/has/exec/permission;spark.executor.extraJavaOptions=-Djava.io.tmpdir=/dir/has/exec/permission
```

For more details, refer to Troubleshoot Spark > Spark applications failing after upgrade.
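In this context, `java.lang.UnsatisfiedLinkError` typically means Spark cannot execute native libraries extracted into the Java temporary directory, commonly because the backing filesystem is mounted with `noexec`. The following is a minimal sketch of creating and verifying a suitable directory; the path is a hypothetical example, not a required location:

```bash
# Hypothetical path; any directory on a filesystem mounted with exec permission works
TMP_DIR=/opt/incorta/spark-tmp

# Create the directory and limit access to the account that runs the Spark services
mkdir -p "$TMP_DIR"
chmod 700 "$TMP_DIR"

# Show the mount options of the backing filesystem; if "noexec" appears
# in OPTIONS, choose a directory on a different filesystem
findmnt -T "$TMP_DIR" -o TARGET,OPTIONS
```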
There are four guides detailing how to upgrade to Release 6.0: