Feed aggregator

pgconf.eu 2025 – RECAP

Yann Neuhaus - Sun, 2025-10-26 13:30

I was fortunate to be able to attend pgconf.eu 2025.
This year's event took place in Riga and once again brought together key members of the community: contributors, committers, sponsors, and users from around the world.
I would summarize this year's event with three main topics: AI/LLMs, PG18, and Monitoring.

AI/LLMs

Compared to last year, the formula changed a bit for the Community Events Day on Tuesday, where different "Summits" were organized for the first time. If you want full details on the event and the schedule, as well as the presentation slides of each talk, you can find them here: Schedule — PostgreSQL Conference Europe 2025.
I had the chance to be selected as a speaker for the AI Summit, which was quite interesting for me. In total there were 13 short talks (10 minutes each) on various topics related to PostgreSQL and AI/LLMs. It was dense, with a lot of interesting implementation ideas; you can find the details and slides here: PGConf.EU 2025 PostgreSQL AI Summit – PostgreSQL wiki. AI/LLMs are the hot topic of the moment, and naturally they came up often during this event, both in the talks and in the discussions. You can find the PDF of my presentation here. I presented a business-case implementation of a BI self-service agentic RAG that finds the relevant fields for a target KPI and produces data marts as output.

Since the talks were short, there was time at the end for a debate between the audience and the speakers. The discussion, nicely moderated by the organizers, was interesting because it exposed the same strong opinions people have in general about AI/LLMs: a blend of distrust and incomplete understanding of what the technology is about or how it could help organizations. This shows, in itself, that the PostgreSQL community has the same difficulty separating technical challenges from organizational/human ones. My view is that we don't really have technical challenges; they are almost irrelevant to most of the arguments. The real questions are about human relations and about understanding the value that a DBA, for example, brings to the organization. To me, installing and configuring PostgreSQL brings no benefit in terms of personal growth, so automating it is quite natural, and adding AI/LLMs on top is "nice to have" but not fundamentally different from an Ansible playbook. For a junior DBA, however, this is an additional abstraction that can be dangerous, because it provides tools whose consequences users can't fully grasp. This outlines that the main issue with integrating AI/LLM workflows is a governance/C-level-management issue rather than a technical one, and it can't become the latest excuse for adding to the technical debt.
Jay Miller from Aiven explained how easily you can fail at protecting PII when exposing it to LLMs and MCPs. This is a really relevant topic, knowing that more and more organizations are facing issues like shadow IT. He was also quite the show host and fun to listen to. I strongly recommend watching the recording once it is released.

PG18

This year's conference came just after the PostgreSQL 18 release, a version that brings major improvements and initiates changes for releases to come. I was quite enthusiastic to listen to Melanie Plageman on her work improving freezing in this release. I have to say, usually when I attend an advanced internals talk, I am more confused afterwards than before. But here, Melanie did an amazing job of covering a technically complex topic without losing the audience.
Gülçin Yıldırım Jelínek, for her part, explained what's new in PG18 regarding constraints, such as NOT ENFORCED and NOT NULL, and how to use them. During a sponsor talk, Raj Verma, COO of Cybertec, explained why compliance matters, how to minimize the risks, and how PostgreSQL helps us be PCI DSS, GDPR, nLPD, or HIPAA compliant.
Another interesting talk I was happy to attend was from Floor Drees and Gabriele Bartolini. They explained how they went about bringing the CloudNativePG project into the CNCF.

Monitoring

This leads me to another important topic. I wasn't looking for it, but it has become a bit of a main subject for me over the years as a DBA interested in performance tuning. Monitoring in PostgreSQL was covered by several talks, such as Luigi Nardi's on the idea of workload fingerprinting with their DBtune tool. Additionally, Lukas Fittl presented pg_stat_plans, an extension which aims at tracking execution plans over time. This is definitely something I am going to try, and I will push for its inclusion in the core extensions, if not the core code itself.
The reason is obvious to me: PostgreSQL is becoming more and more central to enterprise organizations, and apart from subjects like TDE, monitoring is going to become a key aspect of automation, CloudNativePG, and AI/LLM workflows. Making PostgreSQL easier to monitor at its core will allow leveraging it at all these levels. Cloud companies realize that already, hence their involvement in similar projects.
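For anyone who wants to try the extension themselves, here is a hedged sketch of what the setup might look like. The setting and extension names below are taken from the project's public documentation; verify them against the release you actually install, as I haven't confirmed the exact details here:

```
# postgresql.conf — the extension must be preloaded (server restart required)
shared_preload_libraries = 'pg_stat_plans'
```

After the restart, running `CREATE EXTENSION pg_stat_plans;` in the target database should expose a view of the captured plans that can be queried like pg_stat_statements.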

In the end, this year was once again an occasion for me to think about many relevant topics and to exchange with PostgreSQL hackers as well as users from around the world. I came back home with my head full of ideas to investigate.

Additionally, after the conference, the videos of each talk will be uploaded to the PGConf Europe YouTube channel: PostgreSQL Europe. In the meantime, you can already check out previous amazing talks and this year's pgDay Paris.

So once again, the PostgreSQL flag was flying high!

The article pgconf.eu 2025 – RECAP appeared first on the dbi Blog.

Alfresco – When a JDBC driver breaks after years of stability

Yann Neuhaus - Sun, 2025-10-26 05:06

A customer recently contacted us regarding an issue with an old Alfresco Enterprise 4.2.2 environment. The system had been running stably for years, with a nightly restart of the Tomcat/Alfresco service to recycle memory and work around long-standing issues in this version. Earlier this week, one of these nightly restarts suddenly failed, and since then Alfresco completely refused to start. Unfortunately, that was on Production, so it was rather important! Yes, it is a very old version of Alfresco (on an old OS as well, obviously), but the customer doesn't want to upgrade or patch it, so it was left as-is since its installation by another company that is no longer around.

The error

When reviewing the Tomcat logs, the startup sequence looked normal until Alfresco attempted to initialize Hibernate and connect to its MySQL database. Here is the relevant part of the log with the error:

oct. 22, 2025 5:45:37 AM org.apache.catalina.core.AprLifecycleListener lifecycleEvent
INFOS: Loaded APR based Apache Tomcat Native library 1.1.33 using APR version 1.4.6.
oct. 22, 2025 5:45:37 AM org.apache.catalina.core.AprLifecycleListener lifecycleEvent
INFOS: APR capabilities: IPv6 [true], sendfile [true], accept filters [false], random [true].
oct. 22, 2025 5:45:37 AM org.apache.catalina.core.AprLifecycleListener initializeSSL
INFOS: OpenSSL successfully initialized (OpenSSL 0.9.8y 5 Feb 2013)
oct. 22, 2025 5:45:37 AM org.apache.coyote.AbstractProtocol init
INFOS: Initializing ProtocolHandler ["http-apr-8080"]
oct. 22, 2025 5:45:37 AM org.apache.coyote.AbstractProtocol init
INFOS: Initializing ProtocolHandler ["ajp-apr-8009"]
oct. 22, 2025 5:45:37 AM org.apache.coyote.AbstractProtocol init
INFOS: Initializing ProtocolHandler ["http-bio-8443"]
oct. 22, 2025 5:45:37 AM org.apache.catalina.startup.Catalina load
INFOS: Initialization processed in 995 ms
...
2025-10-22 05:46:15,118  INFO  [alfresco.repo.admin] [localhost-startStop-1] Using database URL 'jdbc:mysql://localhost:3306/alfresco?autoReconnect=true' with user 'alfresco'.
2025-10-22 05:46:15,491  INFO  [alfresco.repo.admin] [localhost-startStop-1] Connected to database MySQL version 5.6.28-log
2025-10-22 05:46:21,836  INFO  [management.subsystems.ChildApplicationContextFactory] [localhost-startStop-1] Starting 'sysAdmin' subsystem, ID: [sysAdmin, default]
2025-10-22 05:46:21,863  INFO  [management.subsystems.ChildApplicationContextFactory] [localhost-startStop-1] Startup of 'sysAdmin' subsystem, ID: [sysAdmin, default] complete
2025-10-22 05:46:21,902  INFO  [domain.schema.SchemaBootstrap] [localhost-startStop-1] Ignoring script patch (post-Hibernate): patch.db-V4.2-metadata-query-indexes
2025-10-22 05:46:31,210  WARN  [hibernate.cfg.SettingsFactory] [localhost-startStop-1] Could not obtain connection metadata
org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (java.lang.NullPointerException)
        at org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1549)
        at org.apache.commons.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1388)
        at org.apache.commons.dbcp.BasicDataSource.getConnection(BasicDataSource.java:1044)
        at org.springframework.orm.hibernate3.LocalDataSourceConnectionProvider.getConnection(LocalDataSourceConnectionProvider.java:81)
        at org.hibernate.cfg.SettingsFactory.buildSettings(SettingsFactory.java:84)
        at org.hibernate.cfg.Configuration.buildSettings(Configuration.java:2079)
        at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1304)
        at org.springframework.orm.hibernate3.LocalSessionFactoryBean.newSessionFactory(LocalSessionFactoryBean.java:860)
        at org.springframework.orm.hibernate3.LocalSessionFactoryBean.buildSessionFactory(LocalSessionFactoryBean.java:779)
        at org.springframework.orm.hibernate3.AbstractSessionFactoryBean.afterPropertiesSet(AbstractSessionFactoryBean.java:211)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1477)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1417)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
        at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:291)
        at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
        at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:288)
        at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:190)
        at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:563)
        at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:895)
        at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:425)
        at org.springframework.web.context.ContextLoader.createWebApplicationContext(ContextLoader.java:276)
        at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:197)
        at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:47)
        at org.alfresco.web.app.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:63)
        at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:5068)
        at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5584)
        at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:147)
        at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:899)
        at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:875)
        at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:652)
        at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1092)
        at org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1984)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.sql.SQLException: java.lang.NullPointerException
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1073)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:987)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:982)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:927)
        at com.mysql.jdbc.ConnectionImpl.buildCollationMapping(ConnectionImpl.java:1023)
        at com.mysql.jdbc.ConnectionImpl.initializePropsFromServer(ConnectionImpl.java:3451)
        at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2443)
        at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2213)
        at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:797)
        at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:47)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
        at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:389)
        at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:305)
        at org.apache.commons.dbcp.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:38)
        at org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
        at org.apache.commons.dbcp.BasicDataSource.validateConnectionFactory(BasicDataSource.java:1556)
        at org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1545)
        ... 37 more
Caused by: java.lang.NullPointerException
        at java.util.TreeMap.put(TreeMap.java:556)
        at com.mysql.jdbc.Util.resultSetToMap(Util.java:506)
        at com.mysql.jdbc.ConnectionImpl.buildCollationMapping(ConnectionImpl.java:964)
        ... 53 more
2025-10-22 05:46:31,597  INFO  [management.subsystems.ChildApplicationContextFactory] [localhost-startStop-1] Stopping 'sysAdmin' subsystem, ID: [sysAdmin, default]
2025-10-22 05:46:31,597  INFO  [management.subsystems.ChildApplicationContextFactory] [localhost-startStop-1] Stopped 'sysAdmin' subsystem, ID: [sysAdmin, default]
2025-10-22 05:46:31,607  ERROR [web.context.ContextLoader] [localhost-startStop-1] Context initialization failed
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'extranetDialectResourceLoader' defined in class path resource [alfresco/extension/context/typo3-context.xml]: Cannot create inner bean 'org.springframework.beans.factory.config.PropertyPathFactoryBean#70397d85' of type [org.springframework.beans.factory.config.PropertyPathFactoryBean] while setting bean property 'dialectClass'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.springframework.beans.factory.config.PropertyPathFactoryBean#70397d85' defined in class path resource [alfresco/extension/context/typo3-context.xml]: Initialization of bean failed; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'extranetDialect': FactoryBean threw exception on object creation; nested exception is org.hibernate.exception.GenericJDBCException: Cannot open connection
        at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:281)
        at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:120)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1325)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1086)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:517)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
        at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:291)
        at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
        at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:288)
        at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:190)
        at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:580)
        at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:895)
        at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:425)
        at org.springframework.web.context.ContextLoader.createWebApplicationContext(ContextLoader.java:276)
        at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:197)
        at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:47)
        at org.alfresco.web.app.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:63)
        at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:5068)
        at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5584)
        at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:147)
        at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:899)
        at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:875)
        at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:652)
        at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1092)
        at org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1984)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.springframework.beans.factory.config.PropertyPathFactoryBean#70397d85' defined in class path resource [alfresco/extension/context/typo3-context.xml]: Initialization of bean failed; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'extranetDialect': FactoryBean threw exception on object creation; nested exception is org.hibernate.exception.GenericJDBCException: Cannot open connection
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:527)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
        at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:270)
        ... 29 more
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'extranetDialect': FactoryBean threw exception on object creation; nested exception is org.hibernate.exception.GenericJDBCException: Cannot open connection
        at org.springframework.beans.factory.support.FactoryBeanRegistrySupport.doGetObjectFromFactoryBean(FactoryBeanRegistrySupport.java:149)
        at org.springframework.beans.factory.support.FactoryBeanRegistrySupport.getObjectFromFactoryBean(FactoryBeanRegistrySupport.java:102)
        at org.springframework.beans.factory.support.AbstractBeanFactory.getObjectForBeanInstance(AbstractBeanFactory.java:1429)
        at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:245)
        at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:190)
        at org.springframework.beans.factory.config.PropertyPathFactoryBean.setBeanFactory(PropertyPathFactoryBean.java:186)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeAwareMethods(AbstractAutowireCapableBeanFactory.java:1439)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1408)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
        ... 31 more
Caused by: org.hibernate.exception.GenericJDBCException: Cannot open connection
        at org.hibernate.exception.SQLStateConverter.handledNonSpecificException(SQLStateConverter.java:103)
        at org.hibernate.exception.SQLStateConverter.convert(SQLStateConverter.java:91)
        at org.hibernate.exception.JDBCExceptionHelper.convert(JDBCExceptionHelper.java:43)
        at org.hibernate.exception.JDBCExceptionHelper.convert(JDBCExceptionHelper.java:29)
        at org.hibernate.jdbc.ConnectionManager.openConnection(ConnectionManager.java:426)
        at org.hibernate.jdbc.ConnectionManager.getConnection(ConnectionManager.java:144)
        at org.hibernate.jdbc.BorrowedConnectionProxy.invoke(BorrowedConnectionProxy.java:150)
        at com.sun.proxy.$Proxy5.setAutoCommit(Unknown Source)
        at org.alfresco.hibernate.DialectFactoryBean.getObject(DialectFactoryBean.java:67)
        at org.alfresco.hibernate.DialectFactoryBean.getObject(DialectFactoryBean.java:39)
        at org.springframework.beans.factory.support.FactoryBeanRegistrySupport.doGetObjectFromFactoryBean(FactoryBeanRegistrySupport.java:142)
        ... 39 more
Caused by: org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (java.lang.NullPointerException)
        at org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1549)
        at org.apache.commons.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1388)
        at org.apache.commons.dbcp.BasicDataSource.getConnection(BasicDataSource.java:1044)
        at org.springframework.orm.hibernate3.LocalDataSourceConnectionProvider.getConnection(LocalDataSourceConnectionProvider.java:81)
        at org.hibernate.jdbc.ConnectionManager.openConnection(ConnectionManager.java:423)
        ... 45 more
Caused by: java.sql.SQLException: java.lang.NullPointerException
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1073)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:987)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:982)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:927)
        at com.mysql.jdbc.ConnectionImpl.buildCollationMapping(ConnectionImpl.java:1023)
        at com.mysql.jdbc.ConnectionImpl.initializePropsFromServer(ConnectionImpl.java:3451)
        at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2443)
        at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2213)
        at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:797)
        at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:47)
        at sun.reflect.GeneratedConstructorAccessor103.newInstance(Unknown Source)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
        at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:389)
        at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:305)
        at org.apache.commons.dbcp.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:38)
        at org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
        at org.apache.commons.dbcp.BasicDataSource.validateConnectionFactory(BasicDataSource.java:1556)
        at org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1545)
        ... 49 more
Caused by: java.lang.NullPointerException
        at java.util.TreeMap.put(TreeMap.java:556)
        at com.mysql.jdbc.Util.resultSetToMap(Util.java:506)
        at com.mysql.jdbc.ConnectionImpl.buildCollationMapping(ConnectionImpl.java:964)
        ... 64 more
oct. 22, 2025 5:46:31 AM org.apache.catalina.core.StandardContext listenerStart
GRAVE: Exception lors de l'envoi de l'évènement contexte initialisé (context initialized) à l'instance de classe d'écoute (listener) org.alfresco.web.app.ContextLoaderListener
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'extranetDialectResourceLoader' defined in class path resource [alfresco/extension/context/typo3-context.xml]: Cannot create inner bean 'org.springframework.beans.factory.config.PropertyPathFactoryBean#70397d85' of type [org.springframework.beans.factory.config.PropertyPathFactoryBean] while setting bean property 'dialectClass'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.springframework.beans.factory.config.PropertyPathFactoryBean#70397d85' defined in class path resource [alfresco/extension/context/typo3-context.xml]: Initialization of bean failed; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'extranetDialect': FactoryBean threw exception on object creation; nested exception is org.hibernate.exception.GenericJDBCException: Cannot open connection
        at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:281)
        at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:120)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1325)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1086)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:517)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
        at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:291)
        ...

The key clue here appears to be the NullPointerException from the buildCollationMapping(), which probably indicates a problem during the JDBC driver’s initialization of MySQL collation data.
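As an aside, I believe newer Connector/J 5.1 releases added a `detectCustomCollations` connection property that lets the driver skip this collation-mapping step entirely. It is not available in the old driver used here, but on a patched driver a URL along these lines would avoid the failing code path (treat the property name as an assumption and verify it against your driver's documentation):

```
jdbc:mysql://localhost:3306/alfresco?autoReconnect=true&detectCustomCollations=false
```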

DB & JDBC Connector Details

Given that the error appeared to be linked to the DB, I checked that it was properly running and responding, while also retrieving its version:

mysql> SELECT VERSION();
+------------+
| VERSION()  |
+------------+
| 5.6.28-log |
+------------+

1 row in set (0.01 sec)

Everything looked fine, and the collation list was consistent, with no apparent missing data and no "null" values:

mysql> SHOW COLLATION;
+---------------------+----------+-----+---------+----------+---------+
| Collation           | Charset  | Id  | Default | Compiled | Sortlen |
+---------------------+----------+-----+---------+----------+---------+
| big5_chinese_ci     | big5     |   1 | Yes     | Yes      |       1 |
| big5_bin            | big5     |  84 |         | Yes      |       1 |
| dec8_swedish_ci     | dec8     |   3 | Yes     | Yes      |       1 |
| dec8_bin            | dec8     |  69 |         | Yes      |       1 |
...

Another important thing to check is the JDBC Connector used by Tomcat to connect to the database (since Tomcat doesn't use the mysql client binary to connect, unlike what I did above):

[alfresco@alf01 ~]$ cd $CATALINA_HOME
[alfresco@alf01 tomcat]$
[alfresco@alf01 tomcat]$ ls -l lib/mysql-*
-rw-r--r--. 1 alfresco alfresco  802721 Jan 21  2015 lib/mysql-connector-java-5.1.20-bin.jar
[alfresco@alf01 tomcat]$

The MySQL JDBC Connector version 5.1.20 was released in May 2012. Looking into known MySQL Connector/J bugs, it appeared that several problems affected it: issues with long-running MySQL instances, some related to buildCollationMapping, etc… The MySQL database of this environment had been running for the past 2.5 years without a restart. Therefore, the first thing we suggested was simply to restart the database. Unfortunately, it didn't help.
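To be sure which driver version is really on the classpath (file names can lie after manual copies), the jar's manifest can be inspected. The real command is the `unzip -p` line shown in the comment; the block below runs the same extraction against an inline sample manifest line so it is reproducible here:

```shell
# In the real environment, against the actual jar:
#   unzip -p lib/mysql-connector-java-*.jar META-INF/MANIFEST.MF | grep Implementation-Version
# Sample manifest line standing in for the jar's content:
manifest='Implementation-Version: 5.1.20'
printf '%s\n' "$manifest" | grep '^Implementation-Version' | awk '{print $2}'
# → 5.1.20
```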

Patching the JDBC Connector

Since patching/upgrading Alfresco and the DB was out of the question, the only easy thing we could try next was to patch the JDBC Connector. Looking at the MySQL download website for that component (c.f. here), the latest minor version available is 5.1.49, released in April 2020. I therefore downloaded and extracted it, and then stopped the (not really) running Tomcat process:

[alfresco@alf01 tomcat]$ wget "https://cdn.mysql.com/archives/mysql-connector-java-5.1/mysql-connector-java-5.1.49.zip"
--2025-10-22 09:43:47--  https://cdn.mysql.com/archives/mysql-connector-java-5.1/mysql-connector-java-5.1.49.zip
Resolving cdn.mysql.com... 2.19.75.100, 2001:918:ffcb:1a6::1d68, 2001:918:ffcb:181::1d68
Connecting to cdn.mysql.com|2.19.75.100|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 3722067 (3.5M) [application/zip]
Saving to: “mysql-connector-java-5.1.49.zip”

100%[============================================================>] 3,722,067   19.1M/s   in 0.2s

2025-10-22 09:43:48 (19.1 MB/s) - “mysql-connector-java-5.1.49.zip” saved [3722067/3722067]

[alfresco@alf01 tomcat]$
[alfresco@alf01 tomcat]$ unzip mysql-connector-java-5.1.49.zip
Archive:  mysql-connector-java-5.1.49.zip
   creating: mysql-connector-java-5.1.49/
   creating: mysql-connector-java-5.1.49/src/
   creating: mysql-connector-java-5.1.49/src/com/
   ...
   inflating: mysql-connector-java-5.1.49/src/testsuite/ssl-test-certs/server-cert.pem
   inflating: mysql-connector-java-5.1.49/src/testsuite/ssl-test-certs/server-key.pem
[alfresco@alf01 tomcat]$
[alfresco@alf01 tomcat]$ ../alfresco.sh stop tomcat
  **
  ** Alfresco has been stopped successfully
  **
[alfresco@alf01 tomcat]$

The final step was to keep a backup of both the current and the new versions of the JAR. I then replaced the file in the Tomcat lib folder before restarting the process:

[alfresco@alf01 tomcat]$ ls -l lib/mysql-*
-rw-r--r--. 1 alfresco alfresco  802721 Jan 21  2015 lib/mysql-connector-java-5.1.20-bin.jar
[alfresco@alf01 tomcat]$
[alfresco@alf01 tomcat]$ mv lib/mysql-connector-java-5.1.20-bin.jar ./
[alfresco@alf01 tomcat]$ mv mysql-connector-java-5.1.49/mysql-connector-java-5.1.49.jar ./
[alfresco@alf01 tomcat]$ cp -p mysql-connector-java-5.1.49.jar lib/
[alfresco@alf01 tomcat]$ rm -rf mysql-connector-java-5.1.49.zip mysql-connector-java-5.1.49
[alfresco@alf01 tomcat]$
[alfresco@alf01 tomcat]$ ls -l lib/mysql-*
-rw-r--r--. 1 alfresco alfresco 1006904 Apr 20  2020 lib/mysql-connector-java-5.1.49.jar
[alfresco@alf01 tomcat]$
[alfresco@alf01 tomcat]$ ../alfresco.sh start tomcat
  **
  ** Alfresco has been started
  **
[alfresco@alf01 tomcat]$

Once done, the Tomcat logs looked much better: there were no errors related to the database connection anymore. Alfresco was able to finish its startup, and users were able to log in to Alfresco Share again:

oct. 22, 2025 8:27:35 AM org.apache.catalina.core.AprLifecycleListener lifecycleEvent
INFOS: Loaded APR based Apache Tomcat Native library 1.1.33 using APR version 1.4.6.
oct. 22, 2025 8:27:35 AM org.apache.catalina.core.AprLifecycleListener lifecycleEvent
INFOS: APR capabilities: IPv6 [true], sendfile [true], accept filters [false], random [true].
oct. 22, 2025 8:27:35 AM org.apache.catalina.core.AprLifecycleListener initializeSSL
INFOS: OpenSSL successfully initialized (OpenSSL 0.9.8y 5 Feb 2013)
oct. 22, 2025 8:27:35 AM org.apache.coyote.AbstractProtocol init
INFOS: Initializing ProtocolHandler ["http-apr-8080"]
oct. 22, 2025 8:27:35 AM org.apache.coyote.AbstractProtocol init
INFOS: Initializing ProtocolHandler ["ajp-apr-8009"]
oct. 22, 2025 8:27:35 AM org.apache.coyote.AbstractProtocol init
INFOS: Initializing ProtocolHandler ["http-bio-8443"]
oct. 22, 2025 8:27:35 AM org.apache.catalina.startup.Catalina load
INFOS: Initialization processed in 989 ms
...
2025-10-22 08:28:16,988  INFO  [alfresco.repo.admin] [localhost-startStop-1] Using database URL 'jdbc:mysql://localhost:3306/alfresco?autoReconnect=true' with user 'alfresco'.
2025-10-22 08:28:17,300  INFO  [alfresco.repo.admin] [localhost-startStop-1] Connected to database MySQL version 5.6.28-log
2025-10-22 08:28:23,797  INFO  [management.subsystems.ChildApplicationContextFactory] [localhost-startStop-1] Starting 'sysAdmin' subsystem, ID: [sysAdmin, default]
2025-10-22 08:28:23,822  INFO  [management.subsystems.ChildApplicationContextFactory] [localhost-startStop-1] Startup of 'sysAdmin' subsystem, ID: [sysAdmin, default] complete
2025-10-22 08:28:23,859  INFO  [domain.schema.SchemaBootstrap] [localhost-startStop-1] Ignoring script patch (post-Hibernate): patch.db-V4.2-metadata-query-indexes
Wed Oct 22 08:28:33 CEST 2025 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
Wed Oct 22 08:28:33 CEST 2025 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.

SETTING APPLICATION CONTEXT

2025-10-22 08:28:33,976  INFO  [extensions.webscripts.TemplateProcessorRegistry] [localhost-startStop-1] Registered template processor Repository Template Processor for extension ftl
2025-10-22 08:28:33,982  INFO  [extensions.webscripts.ScriptProcessorRegistry] [localhost-startStop-1] Registered script processor Repository Script Processor for extension js
2025-10-22 08:28:40,396  INFO  [domain.schema.SchemaBootstrap] [localhost-startStop-1] Connecting to database: jdbc:mysql://localhost:3306/alfresco?autoReconnect=true, UserName=alfresco@localhost, MySQL Connector Java
2025-10-22 08:28:40,396  INFO  [domain.schema.SchemaBootstrap] [localhost-startStop-1] Schema managed by database dialect org.hibernate.dialect.MySQLInnoDBDialect.
2025-10-22 08:28:41,104  INFO  [domain.schema.SchemaBootstrap] [localhost-startStop-1] No changes were made to the schema.
2025-10-22 08:28:42,093  INFO  [management.subsystems.ChildApplicationContextFactory] [localhost-startStop-1] Starting 'Authentication' subsystem, ID: [Authentication, managed, alfinst]
2025-10-22 08:28:42,545  INFO  [management.subsystems.ChildApplicationContextFactory] [localhost-startStop-1] Startup of 'Authentication' subsystem, ID: [Authentication, managed, alfinst] complete
...
oct. 22, 2025 8:30:09 AM org.apache.coyote.AbstractProtocol start
INFOS: Starting ProtocolHandler ["http-apr-8080"]
oct. 22, 2025 8:30:09 AM org.apache.coyote.AbstractProtocol start
INFOS: Starting ProtocolHandler ["ajp-apr-8009"]
oct. 22, 2025 8:30:09 AM org.apache.coyote.AbstractProtocol start
INFOS: Starting ProtocolHandler ["http-bio-8443"]
oct. 22, 2025 8:30:09 AM org.apache.catalina.startup.Catalina start
INFOS: Server startup in 153447 ms
2025-10-22 08:30:10,121  INFO  [web.site.EditionInterceptor] [ajp-apr-8009-exec-7] Successfully retrieved license information from Alfresco.
2025-10-22 08:30:39,780  INFO  [web.scripts.ImapServerStatus] [ajp-apr-8009-exec-3] Successfully retrieved IMAP server status from Alfresco: disabled
2025-10-22 08:30:40,107  INFO  [web.scripts.SyncModeConfig] [ajp-apr-8009-exec-3] Successfully retrieved Sync Mode configuration from Alfresco: ON_PREMISE

This issue was pretty interesting, as it shows that even on a system that hasn’t changed in more than 10 years (same DB/Alfresco/Tomcat/Java/JDBC Connector versions) and that restarts daily, you can still face bugs at any time… I can’t check all the details of these bugs while writing this blog, as the MySQL bugs website is horribly slow and keeps crashing today, but this clearly shows that even when maintaining old platforms that can’t be upgraded, it’s still worth keeping small components like JDBC connectors up to date.

The article Alfresco – When a JDBC driver breaks after years of stability first appeared on the dbi Blog.

Dctm – D2-DAR installation fails because of workflow trackers

Yann Neuhaus - Sat, 2025-10-25 02:11

In a Documentum upgrade project from 20.x to 23.4 earlier this year, I faced an issue that prevented the installation/upgrade of D2. The problem was located inside the D2-DAR, which failed with an ORA-01427 error while trying to update the workflow trackers. The exact error displayed was this one:

[emc.installer] dmbasic output : [DM_SERVER_E_EXEC_SQL]error:  "The execsql method failed due to a database error: ORA-01427: single-row subquery returns more than one row"

Before that happened, I had successfully upgraded 7 or 8 environments from 20.x to 23.4 for that customer, and after that, I did around 20 more. However, this issue only happened once, so I’m not sure what exactly was wrong, and the Application/Business teams couldn’t find what really caused the initial issue (inside the DB) either… In any case, to start my investigation, I quickly compared the generated logs of the D2-DAR installation on a working vs a non-working Repository.

Working Repository logs:

###
## Working repo:
###
[dmadmin@cs-0 DARsInternal]$ cat dar-deploy-D2-DAR.dar-REPO1-20250703-133024.log
[INFO]  ******************************************************
[INFO]  * Headless Composer
[INFO]  * Version:        23.4.0000.0120
[INFO]  * Java version:   17.0.13 (64bit)
[INFO]  * Java home:      $JAVA_HOME
[INFO]  * Set storage type: false
[INFO]  *
[INFO]  * DAR file:       $DM_HOME/install/DARsInternal/D2-DAR.dar
[INFO]  * Project name:   D2-DAR
[INFO]  * Built by Composer: 23.4.0000.0027
[INFO]  *
...
[INFO]  dmbasic output : Finished creating required indexes on D2 types.
[INFO]  dmbasic output : Copied workflow_id in to d2c_workflow_tracker object
[INFO]  dmbasic output : Copied process_id in to d2c_workflow_tracker object
[INFO]  dmbasic output : Copied supervisor_name in to d2c_workflow_tracker object
[INFO]  dmbasic output : Adding d2_digital_sign_tracker_user user to the group d2_digital_signature_tracker
[INFO]  dmbasic output : Fetched the group d2_digital_signature_tracker
[INFO]  dmbasic output : Check for 'Public searches' folder to migrate.
[INFO]  dmbasic output : Folder '/Resources/D2/[MIGRATE] Public searches' not found. Do not migrate.
[INFO]  dmbasic output : Disconnected from the server.
[INFO]  Finished executing post-install script Thu Jul 03 13:32:13 UTC 2025
[INFO]  Project 'D2-DAR' was successfully installed.
[dmadmin@cs-0 DARsInternal]$

Non-working Repository logs:

###
## Non working repo:
###
[dmadmin@cs-0 DARsInternal]$ cat dar-deploy-D2-DAR.dar-REPO2-20250704-144219.log
[INFO]  ******************************************************
[INFO]  * Headless Composer
[INFO]  * Version:        23.4.0000.0120
[INFO]  * Java version:   17.0.13 (64bit)
[INFO]  * Java home:      $JAVA_HOME
[INFO]  * Set storage type: false
[INFO]  *
[INFO]  * DAR file:       $DM_HOME/install/DARsInternal/D2-DAR.dar
[INFO]  * Project name:   D2-DAR
[INFO]  * Built by Composer: 23.4.0000.0027
[INFO]  *
...
[INFO]  dmbasic output : Finished creating required indexes on D2 types.
[INFO]  dmbasic output : Copied workflow_id in to d2c_workflow_tracker object
[INFO]  dmbasic output : [DM_SERVER_E_EXEC_SQL]error:  "The execsql method failed due to a database error: ORA-01427: single-row subquery returns more than one row"
[INFO]  dmbasic output :
[INFO]  dmbasic output : [DM_TYPE_MGR_E_INDEX_ALREADY_EXISTS]error:  "The index being created on type d2_recently_used_profile already exists. The existing index is represented by dmi_index object with id 1f012345800005a0."
[INFO]  dmbasic output :
[INFO]  dmbasic output : [DM_TYPE_MGR_E_INDEX_ALREADY_EXISTS]error:  "The index being created on type d2_recently_used_profile already exists. The existing index is represented by dmi_index object with id 1f0123458000059f."
[INFO]  dmbasic output :
[INFO]  dmbasic output :
[INFO]  dmbasic output : Exiting with dmExit(-1)
[ERROR]  Procedure execution failed with dmbasic exit value : 255
[INFO]  D2-DAR install failed.
[ERROR]  Unable to install dar file $DM_HOME/install/DARsInternal/D2-DAR.dar
com.emc.ide.installer.PostInstallException: Error running post-install procedure "PostInstall". Please contact the procedure owner to verify if it is functioning properly.
        at internal.com.emc.ide.installer.DarInstaller.postInstall(DarInstaller.java:1574)
        at internal.com.emc.ide.installer.DarInstaller.doInstall(DarInstaller.java:669)
        ...
Caused by: com.emc.ide.external.dfc.procedurerunner.ProcedureRunnerException: Procedure execution failed with dmbasic exit value : 255
        at com.emc.ide.external.dfc.procedurerunner.ProcedureRunnerUtils.executeDmBasic(ProcedureRunnerUtils.java:285)
        at com.emc.ide.external.dfc.procedurerunner.ProcedureRunner.execute(ProcedureRunner.java:55)
        ...
[ERROR]  Failed to install DAR
Unable to install dar file $DM_HOME/install/DARsInternal/D2-DAR.dar
        at com.emc.ant.installer.api.InstallerAntTask.installDar(InstallerAntTask.java:273)
        at com.emc.ant.installer.api.InstallerAntTask.execute(InstallerAntTask.java:135)
        ...
Caused by: com.emc.ide.installer.PostInstallException: Error running post-install procedure "PostInstall". Please contact the procedure owner to verify if it is functioning properly.
        at internal.com.emc.ide.installer.DarInstaller.postInstall(DarInstaller.java:1574)
        at internal.com.emc.ide.installer.DarInstaller.doInstall(DarInstaller.java:669)
        ...
Caused by: com.emc.ide.external.dfc.procedurerunner.ProcedureRunnerException: Procedure execution failed with dmbasic exit value : 255
        at com.emc.ide.external.dfc.procedurerunner.ProcedureRunnerUtils.executeDmBasic(ProcedureRunnerUtils.java:285)
        at com.emc.ide.external.dfc.procedurerunner.ProcedureRunner.execute(ProcedureRunner.java:55)
        at internal.com.emc.ide.installer.DarInstaller.postInstall(DarInstaller.java:1570)
        ... 42 more

As is often the case with DAR installation failures, there isn’t a lot of information. It’s clear that the installation failed because there is apparently some bad data inside the DB (a query that should return only 1 row returns more results), but there aren’t many more details than that. By comparing the 2 logs, you can see that the issue is located between these 2 lines:

[INFO]  dmbasic output : Copied workflow_id in to d2c_workflow_tracker object
[INFO]  dmbasic output : Copied process_id in to d2c_workflow_tracker object

Therefore, finding the root cause essentially means finding what is being done between these 2 lines. A DAR (Documentum Archive) is essentially a zip, so you can just extract it and look at its content to see what is being done. In this case (D2 23.4), I could find the relevant source code inside the file named “D2-DAR/bin/content/55/-1569930955/runnableContent.crtext“. Here is an extract of that file:

    Print "Copied workflow_id in to d2c_workflow_tracker object"

    sQuery = "update d2c_workflow_tracker_s set process_id = " & _
             "(select child_id from dm_relation_s where dm_relation_s.parent_id = " & _
             "d2c_workflow_tracker_s.r_object_id and dm_relation_s.relation_name = " & _
             "'D2_WF_TRACKER_TEMPLATE') where (process_id IS NULL or process_id = '0000000000000000') " & _
             " and exists (select child_id from dm_relation_s where dm_relation_s.parent_id = " & _
             "d2c_workflow_tracker_s.r_object_id  and dm_relation_s.relation_name = 'D2_WF_TRACKER_TEMPLATE')"

    bRet = dmAPIExec("execsql," & sess & "," & sQuery)

    if (not bRet) then
      Call Err_Handle(sess, "0")
      Print "Exiting with dmExit(-1)"
      dmExit(-1)
    end if

    Print "Copied process_id in to d2c_workflow_tracker object"

Therefore, the query that is causing the issue is the one referenced above as “sQuery”, i.e. this one, transformed into plain SQL:

update d2c_workflow_tracker_s 
  set process_id=(select child_id from dm_relation_s 
    where dm_relation_s.parent_id=d2c_workflow_tracker_s.r_object_id 
      and dm_relation_s.relation_name='D2_WF_TRACKER_TEMPLATE') 
  where (process_id IS NULL or process_id='0000000000000000') 
    and exists (select child_id from dm_relation_s 
      where dm_relation_s.parent_id=d2c_workflow_tracker_s.r_object_id 
        and dm_relation_s.relation_name='D2_WF_TRACKER_TEMPLATE');

And indeed, executing this query produces the same result:

SQL> update d2c_workflow_tracker_s 
       set process_id=(select child_id from dm_relation_s 
         where dm_relation_s.parent_id=d2c_workflow_tracker_s.r_object_id 
           and dm_relation_s.relation_name='D2_WF_TRACKER_TEMPLATE') 
       where (process_id IS NULL or process_id='0000000000000000') 
         and exists (select child_id from dm_relation_s 
           where dm_relation_s.parent_id=d2c_workflow_tracker_s.r_object_id 
             and dm_relation_s.relation_name='D2_WF_TRACKER_TEMPLATE');
update d2c_workflow_tracker_s set process_id=(select child_id from dm_relation_s where dm_relation_s.parent_id=d2c_workflow_tracker_s.r_object_id and dm_relation_s.relation_name='D2_WF_TRACKER_TEMPLATE') where (process_id IS NULL or process_id='0000000000000000') and exists (select child_id from dm_relation_s where dm_relation_s.parent_id=d2c_workflow_tracker_s.r_object_id and dm_relation_s.relation_name='D2_WF_TRACKER_TEMPLATE')
                                                *
ERROR at line 1:
ORA-01427: single-row subquery returns more than one row

Unfortunately, because of the way the query is formed (mapping each d2c_workflow_tracker_s.process_id 1-by-1 from the dm_relation_s.child_id), you cannot just execute the sub-query on its own (whatever is present between the parentheses):

SQL> select child_id from dm_relation_s 
       where dm_relation_s.parent_id=d2c_workflow_tracker_s.r_object_id 
         and dm_relation_s.relation_name='D2_WF_TRACKER_TEMPLATE';
select child_id from dm_relation_s where dm_relation_s.parent_id=d2c_workflow_tracker_s.r_object_id and dm_relation_s.relation_name='D2_WF_TRACKER_TEMPLATE'
                                                                   *
ERROR at line 1:
ORA-00904: "D2C_WORKFLOW_TRACKER_S"."R_OBJECT_ID": invalid identifier

Even if you “correct” it (by adding the 2nd table name in the “from”), it will just display something that isn’t directly usable:

SQL> select child_id from dm_relation_s, d2c_workflow_tracker_s 
       where dm_relation_s.parent_id=d2c_workflow_tracker_s.r_object_id 
         and dm_relation_s.relation_name='D2_WF_TRACKER_TEMPLATE';

CHILD_ID
----------------
4b01234580044d0f
4b01234580057910
4b01234580002572
...
4b01234580057910
4b01234580057910
4b01234580311d91

1406 rows selected.

What to do then? Well, you need to dig further. The above query returns 1406 rows (the number of dm_process being tracked, or to be more accurate, the number of relations between a Workflow Tracker (parent_id) and its dm_process (child_id)). Assuming the error is legit, this would probably mean that there are fewer Workflow Trackers than that number. Therefore, after a bit of reflection, I executed the following query to find all Workflow Trackers together with the count of relations each of them has. Under normal conditions, each Workflow Tracker should have a single relation (meaning a single dm_process, a 1-by-1 mapping), but that wasn’t the case:

SQL> select d2c_workflow_tracker_s.r_object_id, count(dm_relation_s.parent_id) 
       from d2c_workflow_tracker_s, dm_relation_s 
       where dm_relation_s.parent_id=d2c_workflow_tracker_s.r_object_id 
         and dm_relation_s.relation_name='D2_WF_TRACKER_TEMPLATE' 
       group by d2c_workflow_tracker_s.r_object_id 
       order by 2;

R_OBJECT_ID      COUNT(DM_RELATION_S.PARENT_ID)
---------------- ------------------------------
09012345800df7eb                              1
09012345800df7ed                              1
...
0901234580313e31                              1
090123458004fdde                              2
090123458005cdf5                              2
090123458005fb44                              2

1403 rows selected.

There are only 1403 Workflow Trackers, which means that there are 3 “additional” relations (or 3 dm_process assigned to an already-used Workflow Tracker). To get more details on the Workflows and their Workflow Trackers, these kinds of queries can be executed:

SQL> select parent_id, child_id from dm_relation_s 
       where parent_id IN ('090123458004fdde','090123458005cdf5','090123458005fb44') 
         and relation_name='D2_WF_TRACKER_TEMPLATE' 
       order by 1, 2;

PARENT_ID        CHILD_ID
---------------- ----------------
090123458004fdde 4b01234580002575
090123458004fdde 4b01234580057910
090123458005cdf5 4b01234580057910
090123458005cdf5 4b01234580057910
090123458005fb44 4b01234580057910
090123458005fb44 4b01234580057910



SQL> select r_object_id, object_name, r_creator_name from dm_sysobject_sp 
       where r_object_id IN ('090123458004fdde','090123458005cdf5','090123458005fb44');

R_OBJECT_ID      OBJECT_NAME                                                            R_CREATOR_NAME
---------------- ---------------------------------------------------------------------- -----------------
090123458004fdde Start Import Approval October 29, 2021 at 10:34:45 AM UTC              OQSOP01
090123458005cdf5 Start Ready For Approval November 11, 2021 at 3:55:04 PM UTC           OQAuthor01
090123458005fb44 Start In Review November 16, 2021 at 12:01:32 PM UTC                   OQSOP01



SQL> select r_object_id, object_name from dm_sysobject_sp 
       where r_object_id IN ('4b01234580002575','4b01234580057910');

R_OBJECT_ID      OBJECT_NAME
---------------- ----------------------
4b01234580002575 WF_Import
4b01234580057910 WF_Review_Approve



SQL> select * from dm_relation_sp 
       where parent_id='090123458005cdf5' 
         and relation_name='D2_WF_TRACKER_TEMPLATE';

R_OBJECT_ID      RELATION_NAME                    PARENT_ID        CHILD_ID         CHILD_LABEL                      PERMANENT_LINK   ORDER_NO EFFECTIVE EXPIRATIO DESCRIPTION                 I_PARTITION I_IS_REPLICA   I_VSTAMP
---------------- -------------------------------- ---------------- ---------------- -------------------------------- -------------- ---------- --------- --------- --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- ----------- ------------ ----------
370123458000d92a D2_WF_TRACKER_TEMPLATE           090123458005cdf5 4b01234580057910                                               0          0 01-JAN-01 01-JAN-01                               0             0          0
370123458000cd4f D2_WF_TRACKER_TEMPLATE           090123458005cdf5 4b01234580057910                                               0          0 01-JAN-01 01-JAN-01                               0             0          0

Clearly, there is a Data/Business problem. Two of the three Workflow Trackers have an exact duplicate relation to the same dm_process, while the third one tracks two different dm_process, which is even more surprising.

To be able to proceed, I noted all the IDs and informed the Application/Business teams, and then I did what the SQL command from the D2-DAR was supposed to do: for each affected Workflow Tracker, I took one of the 2 values and assigned it to the process_id:

SQL> update d2c_workflow_tracker_s 
       set process_id='4b01234580002575' 
       where (process_id IS NULL or process_id='0000000000000000') 
         and r_object_id='090123458004fdde';

1 row updated.



SQL> update d2c_workflow_tracker_s 
       set process_id='4b01234580057910' 
       where (process_id IS NULL or process_id='0000000000000000') 
         and r_object_id='090123458005fb44';

1 row updated.



SQL> update d2c_workflow_tracker_s 
       set process_id='4b01234580057910' 
       where (process_id IS NULL or process_id='0000000000000000') 
         and r_object_id='090123458005cdf5';

1 row updated.

After that, I re-triggered the query that was previously failing, and this time it worked properly for all the remaining Workflow Trackers:

SQL> update d2c_workflow_tracker_s 
       set process_id=(select child_id from dm_relation_s 
         where dm_relation_s.parent_id=d2c_workflow_tracker_s.r_object_id 
           and dm_relation_s.relation_name='D2_WF_TRACKER_TEMPLATE') 
       where (process_id IS NULL or process_id='0000000000000000') 
         and exists (select child_id from dm_relation_s 
           where dm_relation_s.parent_id=d2c_workflow_tracker_s.r_object_id 
             and dm_relation_s.relation_name='D2_WF_TRACKER_TEMPLATE');

1400 rows updated.

I was then able to start the upgrade pipeline from scratch again, and it completed properly. As previously stated, the Application/Business teams still haven’t found what caused these 3 rows, but it’s only a DEV environment, so I can only assume it was some human error at some point, like a wrong DQL being executed that duplicated some values, or something similar. In any case, what is interesting for this blog is the investigation process and the way to get to the bottom of things by searching inside the D2 DAR and analyzing the DB content.

The article Dctm – D2-DAR installation fails because of workflow trackers first appeared on the dbi Blog.

Design decision on database tables oracle DB

Tom Kyte - Sat, 2025-10-25 00:00
Hello experts, I have a UNIQUE constraint on (col1, col2) for a table. But, due to new functionality, I need a conditional unique constraint based on col3. So something like this: if col3 = 'Val1': UNIQUE constraint on (col1, col2); else: UNIQUE constraint on (col4, col5). I'm on Oracle DB 19c, and found that creating a unique index with a CASE expression helps. Can you guide me on alternative options and the pros and cons of this design? I do not want to create any new table and want the best way to achieve conditional unique constraints. Thanks
Categories: DBA Blogs

SQL developer hanging

Tom Kyte - Sat, 2025-10-25 00:00
I am using version 23.1.1 and when I start up the application, it just hangs. I have no idea what to look at to even begin to diagnose the problem.
Categories: DBA Blogs

Social Sign-In with Azure

Tom Kyte - Sat, 2025-10-25 00:00
Dear Experts, Social Sign-in with Microsoft Azure/Office 365 in Oracle APEX applications is working well. I used this How-To: https://tm-apex.hashnode.dev/implementing-social-sign-in-with-microsoft-azureoffice-365-in-apex When I use the substitution variable in APEX (&APP_USER.) I get the correct name. So far so good! But we need the samAccountName for checking authorization. I tried #samAccountName#, #sam_account_name# and #sam#. It doesn't work! :( The APEX help references the site https://openid.net/specs/openid-connect-basic-1_0.html#Scopes where I found other keys ("claims"). Not all of them work: for instance #sub# and #family_name# work, while #preferred_username# does not. With the help of Google I found other keys like #upn# (https://promatis.com/ch/en/build-a-secure-oracle-apex-app-with-microsoft-azure-ad-login-and-delegated-calendar-access/) which works fine and is not mentioned on the above website. But my question to you is: how do I get the samAccountName from Azure? What is the correct name/key/claim? Do I maybe have to configure something other than "profile,email" in the scope text field?
Categories: DBA Blogs

Oracle returns default value for column on rows inserted before the column was added

Tom Kyte - Sat, 2025-10-25 00:00
<code>create table add_column_default ( id number )
/
insert into add_column_default ( id ) values ( 0 )
/
alter table add_column_default add col1 number default 5
/
insert into add_column_default ( id, col1 ) values ( 11, null )
/
select * from add_column_default order by id
/

        ID       COL1
---------- ----------
         0          5
        11

2 rows selected.

drop table add_column_default
/</code>

<b>Assumptions:</b> My understanding is that the data block is not modified during the ALTER. That is, the row with id = 0 is not updated to add a col1 value of 5. https://docs.oracle.com/en/database/oracle/oracle-database/19/sqlrf/ALTER-TABLE.html#:~:text=If%20you%20specify%20the%20DEFAULT,subject%20to%20the%20following%20restrictions%3A <i>"If you specify the DEFAULT clause for a column, then the default value is stored as metadata but the column itself is not populated with data. However, subsequent queries that specify the new column are rewritten so that the default value is returned in the result set."</i> Note: This used to not be the case for nullable columns in 11.2. https://docs.oracle.com/cd/E18283_01/server.112/e17118/statements_3001.htm#CJAHHIBI:~:text=When%20you%20add%20a%20column%2C%20the,a%20default%20value%20or%20NULL. <i>"When you add a column, the initial value of each row for the new column is null. ... If you specify the DEFAULT clause for a nullable column, then the default value is added to existing rows as part of this ALTER TABLE statement, and any update triggers defined on the table are fired. This behavior also results if you change a NOT NULL column with a default value to be nullable."</i>

<b>Therefore:</b> Rows inserted before the ALTER do not have col1 information. Rows inserted after the ALTER may not have col1 information (when inserting null into the last fixed-width column in a table).

<b>Confusion/Question:</b> If both rows look the same (with respect to col1 information) in the data block, then how does Oracle know to return a col1 value of 5 for the row with id = 0 and a col1 value of null for the row with id = 11?
Categories: DBA Blogs

XMLTYPE returning unknown special character

Tom Kyte - Sat, 2025-10-25 00:00
Hello Sir, We're using Oracle Database 19c Enterprise Edition. We're getting an XML tag in Varchar2 format, and we're using XMLTYPE to convert it to XML. However, in case there's a special character in the XML tag, it gives an unexpected escaped character in the output. What can we do to get the same output? Sample SQL: <code>select XMLTYPE('<tag1> a''bc </tag1>') from dual;</code> Output: <code><tag1> a&apos;bc </tag1></code> Expected output: <code><tag1> a'bc </tag1></code>
Categories: DBA Blogs

Sql Plan Baseline

Tom Kyte - Sat, 2025-10-25 00:00
Hi Dear Experts, I want to create a SQL plan baseline for one of my system queries in a two-node RAC 19.21, with user SYS and inside a PDB, using the following commands: <code>var v_num number; exec :v_num:=dbms_spm.load_plans_from_cursor_cache(sql_id => '0b3...............',plan_hash_value => 2170529430 ,fixed=>'YES');</code> It gives a successful response. Of course, the output of the above command is returned as 0. And when I check the plan baselines with the following command, I see that nothing new has been created: <code>select * from dba_sql_plan_baselines</code> I check with the following command and see that plan_hash_value equal to 2170529430 exists in memory: <code>select sql_id, plan_hash_value, parsing_schema_name, sql_text from v$sql where sql_id = '0b3...............';</code> What is the problem?
Categories: DBA Blogs

Deleting duplicate records without using rowid and rownum

Tom Kyte - Sat, 2025-10-25 00:00
Hi Tom, If there are any duplicate records in a table, we know how to eliminate the duplicate rows using rowid. But is there any possibility to delete the duplicate records in a table without using rowid and rownum? My friend, who has been working with Oracle for around 4 years, says that it is not possible. But I think there should be some way to do this. Please give me some suggestion, Tom. I would be thankful if you could illustrate with examples. Thanks in advance, Prakash
Categories: DBA Blogs

Table Design

Tom Kyte - Sat, 2025-10-25 00:00
Hello, I work as a DBA and get requests from developers to create tables, which we often review and sometimes change for a better design and/or implementation. One of the developers recently sent a request to create a table such as the following: table_name: t1_relation; column_names: c1_master_id_pk (foreign key to the t1_master table), c1_attribute, c1_value; primary key: all 3 columns. They explained that the data in all these columns is non-nullable and that all 3 are needed to uniquely identify a record. Having all the columns of a table as the primary key didn't look right to me, so I suggested we create a surrogate key and make the 3 columns unique, with NOT NULL constraints on all of them. They initially said yes, then came back and said to change it back to how they requested initially. I'm writing to ask whether this is proper database design and what the ideal way to implement it would be.
Categories: DBA Blogs

AI for Oracle Security

Pete Finnigan - Sat, 2025-10-25 00:00
I spoke about AI and Oracle security back in February in this blog - AI and Oracle Security . I asked the question can we use AI (LLM / Generative AI) in Oracle Security and if so how? The two....[Read More]

Posted by Pete On 15/10/25 At 12:42 PM

Categories: Security Blogs

Can we Detect Disable Trigger in the Oracle Database

Pete Finnigan - Sat, 2025-10-25 00:00
If we want to create layered security in a database around something such as specific data, or to protect access to a resource, or perhaps to control access to a privilege, then we can create a security layer using standard features....[Read More]

Posted by Pete On 24/09/25 At 08:37 AM

Categories: Security Blogs

Fuzzing PL/SQL and Secure Design Patterns for PL/SQL and Oracle

Pete Finnigan - Sat, 2025-10-25 00:00
I have just been notified that I have had two sessions accepted for the UKOUG conference from 30th November to 2nd December 2025 at the East Side Rooms in Birmingham. My first session is called Fuzzing PL/SQL and is overviewed....[Read More]

Posted by Pete On 22/09/25 At 04:41 PM

Categories: Security Blogs

ORA-04063 Errors During Datapatch Rollback of Oracle Patch 30763851

Bobby Durrett's DBA Blog - Fri, 2025-10-24 13:50

I am working on an Oracle 19c database running on a RHEL 7 VM. I have been trying to roll back this patch:

Patch 30763851 IMPDP 11.2 TO 18C OR HIGHER HITS ORA-904 WHEN TABLES HAVE EXTENDED STATISTICS

I have it sitting on top of 19.5

Database Release Update: 19.5.0.0.191015 (30125133)

The Opatch rollback command ran fine, but Datapatch threw these errors:

[2025-10-22 18:50:15]   -> Error at line 11329: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_OPQTYPE_VIEW" has errors"
[2025-10-22 18:50:15]   -> Error at line 11331: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_OPQTYPE_VIEW" has errors"
[2025-10-22 18:50:15]   -> Error at line 11333: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_OPQTYPE_VIEW" has errors"
[2025-10-22 18:50:15]   -> Error at line 11335: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_OPQTYPE_VIEW" has errors"
[2025-10-22 18:50:15]   -> Error at line 11337: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_OPQTYPE_VIEW" has errors"
[2025-10-22 18:50:15]   -> Error at line 11339: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_OPQTYPE_VIEW" has errors"
[2025-10-22 18:50:15]   -> Error at line 11341: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_OPQTYPE_VIEW" has errors"
[2025-10-22 18:50:15]   -> Error at line 11343: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_OPQTYPE_VIEW" has errors"
[2025-10-22 18:50:15]   -> Error at line 11355: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_P2TPARTCOL_VIEW" has errors"
[2025-10-22 18:50:15]   -> Error at line 11357: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_P2TPARTCOL_VIEW" has errors"
[2025-10-22 18:50:15]   -> Error at line 11363: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_SP2TPARTCOL_VIEW" has errors"
[2025-10-22 18:50:15]   -> Error at line 11365: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_SP2TPARTCOL_VIEW" has errors"
[2025-10-22 18:50:15]   -> Error at line 11381: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_COLUMN_VIEW" has errors"
[2025-10-22 18:50:15]   -> Error at line 11383: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_COLUMN_VIEW" has errors"
[2025-10-22 18:50:15]   -> Error at line 11385: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_COLUMN_VIEW" has errors"
[2025-10-22 18:50:15]   -> Error at line 11387: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_COLUMN_VIEW" has errors"
[2025-10-22 18:50:15]   -> Error at line 11389: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_PCOLUMN_VIEW" has errors"
[2025-10-22 18:50:15]   -> Error at line 11391: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_PCOLUMN_VIEW" has errors"
[2025-10-22 18:50:15]   -> Error at line 11393: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_PCOLUMN_VIEW" has errors"
[2025-10-22 18:50:15]   -> Error at line 11395: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_PCOLUMN_VIEW" has errors"
[2025-10-22 18:50:15]   -> Error at line 11397: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_P2TCOLUMN_VIEW" has errors"
[2025-10-22 18:50:15]   -> Error at line 11399: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_P2TCOLUMN_VIEW" has errors"
[2025-10-22 18:50:15]   -> Error at line 11401: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_SP2TCOLUMN_VIEW" has errors"
[2025-10-22 18:50:15]   -> Error at line 11403: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_SP2TCOLUMN_VIEW" has errors"
[2025-10-22 18:50:15]   -> Error at line 11405: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_COLUMN_VIEW" has errors"
[2025-10-22 18:50:15]   -> Error at line 11407: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_COLUMN_VIEW" has errors"
[2025-10-22 18:50:15]   -> Error at line 11409: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_PCOLUMN_VIEW" has errors"
[2025-10-22 18:50:15]   -> Error at line 11411: script rdbms/admin/dpload.sql
[2025-10-22 18:50:15]       - ORA-04063: view "SYS.KU$_PCOLUMN_VIEW" has errors"

I checked DBA_OBJECTS, and all the SYS objects are VALID. I tried querying one of the views, and it worked fine. So I went to My Oracle Support, our Oracle database support site, searched for ORA-04063 together with one of the view names, and found nothing. A Google search also came up empty. I tried just ignoring the errors, but that didn't work. My whole goal was to apply the October 2025 patches that just came out this week, but because the SQL patch registry indicated that patch 30763851 rolled back with errors, every time I applied a new patch it would try to roll 30763851 back first and error again. Here is what DBA_REGISTRY_SQLPATCH looked like after two failed rollback attempts:

INSTALL_ID   PATCH_ID PATCH_TYPE ACTION          STATUS        
---------- ---------- ---------- --------------- --------------
         1   30125133 RU         APPLY           SUCCESS       
         2   30763851 INTERIM    APPLY           SUCCESS       
         3   30763851 INTERIM    ROLLBACK        WITH ERRORS   
         3   30763851 INTERIM    ROLLBACK        WITH ERRORS   
         4   30763851 INTERIM    ROLLBACK        WITH ERRORS   
         4   30763851 INTERIM    ROLLBACK        WITH ERRORS   

Each rollback attempt ran twice, so two rollback attempts left four failure rows.
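For reference, a listing like the one above can be produced with a query along these lines (the exact column selection and ordering are my own, not from the original post):

```sql
-- Sketch: inspect the SQL patch registry for apply/rollback history.
SELECT install_id, patch_id, patch_type, action, status
FROM   dba_registry_sqlpatch
ORDER  BY action_time;
```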

I opened a case with Oracle support just in case this was a known issue that wasn’t available for me to find on my own. Sometimes that happens. But while waiting on Oracle I kept trying to fix it myself.

The errors refer to $ORACLE_HOME/rdbms/admin/dpload.sql, which I think reloads Data Pump after some change. It runs catmetviews.sql and catmetviews_mig.sql, which contain the CREATE VIEW statements for the views getting errors, such as SYS.KU$_OPQTYPE_VIEW. But the code in catmetviews_mig.sql wasn't straightforward. I imagined running some sort of trace to see why the script was throwing the ORA-04063 errors, but I never had to take it that far.

At first all this stressed me out. I thought, “I can’t back out this patch. I will never be able to patch this database to a current patch level.” Then I chilled out and realized that if it was a problem with Oracle’s code, they had to help me back out 30763851. But it might take some time to work through an SR with Oracle.

But what if it wasn't an issue with Oracle's code but something weird in our environment? There were some odd messages coming out that I was used to seeing and didn't think indicated a real problem. They came from triggers installed by an auditing tool called DB Protect, and looked like this:

[SYS.SENSOR_DDL_TRIGGER_A] Caught a standard exception: aliasId=100327, error=-29260, message="ORA-29260: network error: TNS:no listener"

We are used to seeing these errors when we do DDL, but prior to this they never caused any actual problems. We had already decommissioned the DB Protect tool but had not cleaned up the triggers. Dropping SYS.SENSOR_DDL_TRIGGER_A eliminated the ORA-04063 errors.

Probably no one else will ever encounter this same issue, but I thought I would document it. If you have the same symptoms and you are not using DB Protect any more, run these commands:

DROP TRIGGER SYS.SENSOR_DDL_TRIGGER_A;
DROP TRIGGER SYS.SENSOR_DDL_TRIGGER_B;

I think the A trigger was the problem, but we don’t need either one.
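If you want to check first whether these leftover triggers exist in your own database, something along these lines should work (the LIKE pattern is an assumption based on the trigger names above, not a documented DB Protect naming convention):

```sql
-- Hypothetical check for leftover DB Protect triggers before dropping them.
SELECT owner, trigger_name, status
FROM   dba_triggers
WHERE  owner = 'SYS'
AND    trigger_name LIKE 'SENSOR%';
```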

Anyway, this post is just so someone who searches for ORA-04063 and one of the views will find this information and drop the triggers if they have them. It’s a long shot but might as well document it for posterity and for me.

Bobby

Categories: DBA Blogs

PeopleSoft Reconnect 2025 | Dive Deep

Jim Marion - Thu, 2025-10-23 11:07


The premiere PeopleSoft conference, Reconnect | Dive Deep, begins in just a few days. If you are not already registered for this live virtual conference, be sure to do so ASAP. If you are already registered, then be sure to log in to the conference app to build your agenda. Some sessions have capacity limits, and you don't want to miss them!

Here is the list of sessions I am presenting at this year's conference:

Monday

  • 4:45pm EDT P-051853, PeopleSoft Test Framework: More than Automated End-user Testing


Tuesday

  • 11:15 am EDT P-051448, Replacing Customizations with Configurations in Campus Solutions
  • 1:45 pm EDT P-051447, Enterprise Components: The "Other" Toolset
  • 4:15 pm EDT P-051446, PeopleSoft Fluid Best Practices


Wednesday

  • 12:30pm EDT P-051861, What's New in 8.62 for Campus Solutions!


Thursday

  • 12:30 pm EDT P-051442, Getting the Most out of PeopleSoft PeopleTools: Tips and Techniques
  • 1:45 pm EDT P-051445, Isolate and Configure: Don't Customize!

As a registered attendee, be sure to check out our virtual booth to watch replays from prior conference sessions. I look forward to seeing you online next week!

SQL Server: Do you know how to use the SERVERPROPERTY() function in a useful way?

Yann Neuhaus - Wed, 2025-10-22 04:04

The SERVERPROPERTY() function is very handy for quickly getting information about the instance and server. Sometimes it's good to come back to basics…

How to begin? Read the documentation, of course, here — but you will see a lot of information…

After reading the documentation, what is the first useful set of properties?

The first set is about the installed SQL Server: the server name, instance name, edition, build and version, plus some advanced information such as the last update date, the KB, or the CU.

Here is the first query:

SELECT
  SERVERPROPERTY('MachineName') AS [Server_Name],
  SERVERPROPERTY('InstanceName') AS [Instance_Name],
  SERVERPROPERTY('Edition') AS [Edition],
  SERVERPROPERTY('ProductVersion') AS [Product_Version],
  SERVERPROPERTY('ProductLevel') AS [Product_Level],
  SERVERPROPERTY('ProductBuild') AS [Product_Build],
  SERVERPROPERTY('ProductMajorVersion') AS [Advanced_Product_Major_Version],
  SERVERPROPERTY('ProductMinorVersion') AS [Advanced_Product_Minor_Version],
  SERVERPROPERTY('ProductUpdateReference') AS [Advanced_KB_Version],
  SERVERPROPERTY('ProductUpdateLevel') AS [Advanced_Update_Level],
  SERVERPROPERTY('ResourceLastUpdateDateTime') AS [Advanced_Last_Update_Date];

Result of a test:

I ran all queries in my SQL Server container under Visual Studio Code, on SQL Server 2025 CTP 2.1.

You can find more info about it here (dbi Blog).
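One thing worth keeping in mind when building such queries: SERVERPROPERTY() does not raise an error for an unrecognized property name, it simply returns NULL. So a typo, or even a stray space inside the quoted name, fails silently. A quick illustration:

```sql
-- SERVERPROPERTY() returns NULL for any name it does not recognize,
-- so a stray space inside the property name silently breaks the column.
SELECT
  SERVERPROPERTY('ProductVersion')  AS [Valid_Name],     -- returns the version string
  SERVERPROPERTY('ProductVersion ') AS [Trailing_Space]; -- returns NULL
```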

Next, some other useful information: whether the instance has features installed such as PolyBase or Full-Text Search, but also configuration such as FILESTREAM or HA.

Here is the query:

SELECT
  SERVERPROPERTY('MachineName') AS [Server_Name],
  SERVERPROPERTY('InstanceName') AS [Instance_Name],
  SERVERPROPERTY('IsPolyBaseInstalled') AS [Is_Polybase_Installed],
  SERVERPROPERTY('IsFullTextInstalled') AS [Is_Full-Text_Search_Installed],
  SERVERPROPERTY('IsAdvancedAnalyticsInstalled') AS [Is_Advanced_Analytics_Installed],
  SERVERPROPERTY('FilestreamConfiguredLevel') AS [Is_Filestream_Enabled],
  SERVERPROPERTY('IsBigDataCluster') AS [Is_BigData_Cluster_Enabled],
  SERVERPROPERTY('IsClustered') AS [Is_Clustered],
  SERVERPROPERTY('IsHadrEnabled') AS [Is_HADR_Enabled],
  SERVERPROPERTY('IsTempDbMetadataMemoryOptimized') AS [Is_TempDB_for_Memory_Optimized_Tables_Enabled],
  SERVERPROPERTY('IsXTPSupported') AS [Is_In_Memory_OLTP_Supported],
  SERVERPROPERTY('IsExternalGovernanceEnabled') AS [Is_External_Governance_Enabled];

Result of a test:

You can notice that the last property is External Governance. This is linked to the Microsoft Purview access policies.

The next useful information concerns security.

Here is the query:

SELECT
  SERVERPROPERTY('MachineName') AS [Server_Name],
  SERVERPROPERTY('InstanceName') AS [Instance_Name],
  SERVERPROPERTY('IsIntegratedSecurityOnly') AS [Is_Integrated_Security_Only],
  SERVERPROPERTY('IsSingleUser') AS [Is_Single_User],
  SERVERPROPERTY('IsExternalAuthenticationOnly') AS [Is_External_Authentication_Only];

Result of a test:

The External Authentication property concerns Microsoft Entra-only authentication for Azure SQL Database and Azure SQL Managed Instance.

The last information package covers the collation/character set, with this query:

SELECT 
  SERVERPROPERTY('MachineName') AS [Server_Name],
  SERVERPROPERTY('InstanceName') AS [Instance_Name], 
  SERVERPROPERTY('Collation') AS [Collation], 
  SERVERPROPERTY('LCID') AS [Windows_Locale_Identifier], 
  SERVERPROPERTY('SqlCharSetName') AS [SQL_Character_Set_Name], 
  SERVERPROPERTY('SqlSortOrderName') AS [SQL_Sort_Order_Name];

Result of a test:

To conclude, this blog gives you 4 useful queries to find information at the server level.
Don't hesitate to add your comments or tell me about other useful properties that you use!

The article "SQL Server: Do you know how to use the SERVERPROPERTY() function in a useful way?" appeared first on the dbi Blog.

The "Retrieval" for RAG using Semantic Search

Hemant K Chitale - Mon, 2025-10-20 09:31

Reusing the sample code from my previous demo, I built a table with 130 sentences describing AI tools and then used SQL to run the sort of queries that you'd expect RAG to run against an "internal knowledge source".

Here's a preview of the sentences loaded :


SQL> select count(*) from my_data_source;

  COUNT(*)
----------
       130

SQL> select my_sentence from my_data_source fetch first 5 rows only;

MY_SENTENCE
----------------------------------------------------------------------------------------------------------------------------------------------------
        Adobe Firefly: A family of generative AI models integrated into Adobe's Creative Cloud suite. It is used for professional image editing and graphic
 design

        AIVA (Artificial Intelligence Virtual Artist): An AI composer that generates music in various styles for content creators and brands
        Amazon CodeWhisperer: An AI coding assistant from Amazon Web Services that provides contextual code recommendations
        Canva Magic Media: An AI image generator integrated into the user friendly design platform. It is used for creating visuals for social media and pr
esentations

        ChatGPT: An AI chatbot from OpenAI. It can engage in human like conversations, write code, summarize text, and create content such as stories and e
ssays


SQL> select my_sentence from my_data_source where id > 126 ;

MY_SENTENCE
----------------------------------------------------------------------------------------------------------------------------------------------------
        Zubtitle.com   captions for video
        Cohere.ai   large language models
        Grok.com   personal AI assistant
        Claude.ai   advanced AI chatbot

SQL>


The embeddings are then computed and loaded into the target table using the same query (the HuggingFace model ALL_MINILM_L12_V2_AUGMENTED has already been loaded with the ONNX Runtime engine, using the same code as in the previous blog post):


insert into my_data_vectors
select ds.id as id, my_sentence, to_vector(et.embed_vector) sentence_vector
from
    my_data_source ds,
    dbms_vector_chain.utl_to_embeddings(
       dbms_vector_chain.utl_to_chunks(dbms_vector_chain.utl_to_text(ds.my_sentence), json('{"normalize":"all"}')),
       json('{"provider":"database", "model":"ALL_MINILM_L12_V2_AUGMENTED"}')) t,
    JSON_TABLE(t.column_value, '$[*]' COLUMNS (embed_id NUMBER PATH '$.embed_id', embed_data VARCHAR2(4000) PATH '$.embed_data', embed_vector CLOB PATH '$.embed_vector')) et
/


I now test a few "Retrieval" queries. (Note how the queries return different results, with smaller VECTOR_DISTANCE values, as I refine them.)

SQL> ACCEPT text_input CHAR PROMPT 'Enter your query : '
Enter your query : Video
SQL> VARIABLE text_variable VARCHAR2(1000)
SQL> VARIABLE query_vector VECTOR
SQL> BEGIN
  2    :text_variable := '&text_input';
  3    SELECT vector_embedding(ALL_MINILM_L12_V2_AUGMENTED USING  :text_variable as data) into :query_vector;
  4  END;
  5  /
old   2:   :text_variable := '&text_input';
new   2:   :text_variable := 'Video';

PL/SQL procedure successfully completed.

SQL>
SQL>
SQL> SELECT my_sentence, vector_distance(sentence_vector , :query_vector, COSINE) as Calc_Vector_Distance
  2  FROM my_data_vectors
  3  ORDER BY 2
  4  FETCH FIRST 3 ROWS ONLY;

MY_SENTENCE
----------------------------------------------------------------------------------------------------------------------------------------------------
CALC_VECTOR_DISTANCE
--------------------
        Clipchamp.com   quick video creation
          4.635E-001

        Lumen5.com   AI powered social video
          5.277E-001

        Synths.video   convert blogs into videos
           5.39E-001


SQL>
----------------------------------------------------------------------------------------------------------------------------------------------------
----------------------------------------------------------------------------------------------------------------------------------------------------



SQL> ACCEPT text_input CHAR PROMPT 'Enter your query : '
Enter your query : What AI Tool can I use to edit Videos ?
SQL> VARIABLE text_variable VARCHAR2(1000)
SQL> VARIABLE query_vector VECTOR
SQL> BEGIN
  2    :text_variable := '&text_input';
  3    SELECT vector_embedding(ALL_MINILM_L12_V2_AUGMENTED USING  :text_variable as data) into :query_vector;
  4  END;
  5  /
old   2:   :text_variable := '&text_input';
new   2:   :text_variable := 'What AI Tool can I use to edit Videos ?';

PL/SQL procedure successfully completed.

SQL>
SQL>
SQL> SELECT my_sentence, vector_distance(sentence_vector , :query_vector, COSINE) as Calc_Vector_Distance
  2  FROM my_data_vectors
  3  ORDER BY 2
  4  FETCH FIRST 3 ROWS ONLY;

MY_SENTENCE
----------------------------------------------------------------------------------------------------------------------------------------------------
CALC_VECTOR_DISTANCE
--------------------
        Wisecut.video   auto edit with AI
           2.62E-001

        Runwayml.com   AI video editing
          2.777E-001

        Gling.ai   YouTube video editor
          3.413E-001


SQL>
SQL>
----------------------------------------------------------------------------------------------------------------------------------------------------
----------------------------------------------------------------------------------------------------------------------------------------------------



SQL> ACCEPT text_input CHAR PROMPT 'Enter your query : '
Enter your query : image
SQL> VARIABLE text_variable VARCHAR2(1000)
SQL> VARIABLE query_vector VECTOR
SQL> BEGIN
  2    :text_variable := '&text_input';
  3    SELECT vector_embedding(ALL_MINILM_L12_V2_AUGMENTED USING  :text_variable as data) into :query_vector;
  4  END;
  5  /
old   2:   :text_variable := '&text_input';
new   2:   :text_variable := 'image';

PL/SQL procedure successfully completed.

SQL>
SQL>
SQL> SELECT my_sentence, vector_distance(sentence_vector , :query_vector, COSINE) as Calc_Vector_Distance
  2  FROM my_data_vectors
  3  ORDER BY 2
  4  FETCH FIRST 3 ROWS ONLY;

MY_SENTENCE
----------------------------------------------------------------------------------------------------------------------------------------------------
CALC_VECTOR_DISTANCE
--------------------
        VanceAI.com   image enhancement
          5.268E-001

        Canva Magic Media: An AI image generator integrated into the user friendly design platform. It is used for creating visuals for social media and pr
esentations
          5.459E-001

        PicWish.com   photo editing
          5.696E-001


SQL>
SQL>
----------------------------------------------------------------------------------------------------------------------------------------------------
----------------------------------------------------------------------------------------------------------------------------------------------------



SQL> ACCEPT text_input CHAR PROMPT 'Enter your query : '
Enter your query : How can I use AI to edit Images ?
SQL> VARIABLE text_variable VARCHAR2(1000)
SQL> VARIABLE query_vector VECTOR
SQL> BEGIN
  2    :text_variable := '&text_input';
  3    SELECT vector_embedding(ALL_MINILM_L12_V2_AUGMENTED USING  :text_variable as data) into :query_vector;
  4  END;
  5  /
old   2:   :text_variable := '&text_input';
new   2:   :text_variable := 'How can I use AI to edit Images ?';

PL/SQL procedure successfully completed.

SQL>
SQL>
SQL> SELECT my_sentence, vector_distance(sentence_vector , :query_vector, COSINE) as Calc_Vector_Distance
  2  FROM my_data_vectors
  3  ORDER BY 2
  4  FETCH FIRST 3 ROWS ONLY;

MY_SENTENCE
----------------------------------------------------------------------------------------------------------------------------------------------------
CALC_VECTOR_DISTANCE
--------------------
        Hotpot.ai   AI image editing
          2.805E-001

        Runwayml.com   AI video editing
          3.711E-001

        Wisecut.video   auto edit with AI
          3.921E-001


SQL>
SQL>
----------------------------------------------------------------------------------------------------------------------------------------------------
----------------------------------------------------------------------------------------------------------------------------------------------------



SQL> ACCEPT text_input CHAR PROMPT 'Enter your query : '
Enter your query : Can I chat with an AI ?
SQL> VARIABLE text_variable VARCHAR2(1000)
SQL> VARIABLE query_vector VECTOR
SQL> BEGIN
  2    :text_variable := '&text_input';
  3    SELECT vector_embedding(ALL_MINILM_L12_V2_AUGMENTED USING  :text_variable as data) into :query_vector;
  4  END;
  5  /
old   2:   :text_variable := '&text_input';
new   2:   :text_variable := 'Can I chat with an AI ?';

PL/SQL procedure successfully completed.

SQL>
SQL>
SQL> SELECT my_sentence, vector_distance(sentence_vector , :query_vector, COSINE) as Calc_Vector_Distance
  2  FROM my_data_vectors
  3  ORDER BY 2
  4  FETCH FIRST 3 ROWS ONLY;

MY_SENTENCE
----------------------------------------------------------------------------------------------------------------------------------------------------
CALC_VECTOR_DISTANCE
--------------------
        Claude.ai   advanced AI chatbot
          2.989E-001

        Claude.ai   conversational assistant
          3.828E-001

        Grok.com   personal AI assistant
          4.025E-001


SQL>
SQL>
----------------------------------------------------------------------------------------------------------------------------------------------------
----------------------------------------------------------------------------------------------------------------------------------------------------



SQL> ACCEPT text_input CHAR PROMPT 'Enter your query : '
Enter your query : coding
SQL> VARIABLE text_variable VARCHAR2(1000)
SQL> VARIABLE query_vector VECTOR
SQL> BEGIN
  2    :text_variable := '&text_input';
  3    SELECT vector_embedding(ALL_MINILM_L12_V2_AUGMENTED USING  :text_variable as data) into :query_vector;
  4  END;
  5  /
old   2:   :text_variable := '&text_input';
new   2:   :text_variable := 'coding';

PL/SQL procedure successfully completed.

SQL>
SQL>
SQL> SELECT my_sentence, vector_distance(sentence_vector , :query_vector, COSINE) as Calc_Vector_Distance
  2  FROM my_data_vectors
  3  ORDER BY 2
  4  FETCH FIRST 3 ROWS ONLY;

MY_SENTENCE
----------------------------------------------------------------------------------------------------------------------------------------------------
CALC_VECTOR_DISTANCE
--------------------
        Amazon CodeWhisperer: An AI coding assistant from Amazon Web Services that provides contextual code recommendations
          5.543E-001

        Replit.com   write and run code
          5.548E-001

        GitHub Copilot: An AI coding assistant that provides code suggestions and autocompletion within an IDE
          5.677E-001


SQL>
SQL>
----------------------------------------------------------------------------------------------------------------------------------------------------
----------------------------------------------------------------------------------------------------------------------------------------------------


SQL> ACCEPT text_input CHAR PROMPT 'Enter your query : '
Enter your query : Tell me of a Coding Assistant
SQL> VARIABLE text_variable VARCHAR2(1000)
SQL> VARIABLE query_vector VECTOR
SQL> BEGIN
  2    :text_variable := '&text_input';
  3    SELECT vector_embedding(ALL_MINILM_L12_V2_AUGMENTED USING  :text_variable as data) into :query_vector;
  4  END;
  5  /
old   2:   :text_variable := '&text_input';
new   2:   :text_variable := 'Tell me of a Coding Assistant';

PL/SQL procedure successfully completed.

SQL>
SQL>
SQL> SELECT my_sentence, vector_distance(sentence_vector , :query_vector, COSINE) as Calc_Vector_Distance
  2  FROM my_data_vectors
  3  ORDER BY 2
  4  FETCH FIRST 3 ROWS ONLY;

MY_SENTENCE
----------------------------------------------------------------------------------------------------------------------------------------------------
CALC_VECTOR_DISTANCE
--------------------
        GitHub Copilot: An AI coding assistant that provides code suggestions and autocompletion within an IDE
          4.306E-001

        Amazon CodeWhisperer: An AI coding assistant from Amazon Web Services that provides contextual code recommendations
          4.702E-001

        Replit: An online IDE with an integrated AI assistant that helps with code generation, debugging, and project creation
          4.734E-001


SQL>
SQL>

Thus, in the first pair of examples, if I refine (aka "prompt engineering") my query from "Video" to "What AI Tool can I use to edit Videos ?", I get a better set of responses with a smaller VECTOR_DISTANCE from my query.

In the second pair, I change my query from "image" to "How can I use AI to edit Images ?" and get an improved set of responses (again with a smaller VECTOR_DISTANCE).

In the last pair, when I change my query from "coding" to "Tell me of a Coding Assistant", I get responses with a better match to my query.

As noted in my previous blog post, the "LLM" here is actually a "MiniLM" called "all-MiniLM-L12-v2" from HuggingFace.
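A side note on reading the distances above: COSINE distance in Oracle is 1 minus the cosine similarity, so vectors pointing in the same direction score 0 and orthogonal vectors score 1, which is why smaller values mean better matches. A quick sanity check with literal vectors (23ai syntax; the toy vectors are my own, not from the demo table):

```sql
-- COSINE distance = 1 - cosine similarity:
-- 0 for identical directions, 1 for orthogonal vectors.
SELECT vector_distance(TO_VECTOR('[1, 0, 0]'), TO_VECTOR('[1, 0, 0]'), COSINE) AS same_direction,
       vector_distance(TO_VECTOR('[1, 0, 0]'), TO_VECTOR('[0, 1, 0]'), COSINE) AS orthogonal
FROM   dual;
```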

Categories: DBA Blogs

Alfresco – Documents not searchable when uploaded via REST-API?

Yann Neuhaus - Thu, 2025-10-16 13:29

It's a pretty common (and should be well-known) issue in Alfresco, but even to this day enterprises still contact us because of documents that cannot be found. The installation in question is a docker-compose deployment of Alfresco Community 23.4, managed directly by the customer. Only slight customizations were done, and it's a pretty out-of-the-box setup for the search part. There is an interface that uploads documents (using a custom type) into a specific Site through the REST-API, but each of these documents is "apparently" not searchable in Alfresco Share. When downloading and re-uploading manually through Share (with either the default or the custom type), the customer said the documents can indeed be found. So, what is going on here?

As mentioned, this is a pretty common problem. No later than last week, there were again discussions around that topic on the Alfresco Discord channel, and yesterday at that customer, they were facing that exact same thing. Therefore, I think it's high time for me to write a blog on the topic, to share a bit of experience and hopefully reduce the number of people that are impacted, probably without even knowing it. I will use a local development environment to demonstrate the issue and show how to fix it.

The Problem

So, what is the issue? In short, it's linked to Solr6 cross-locale support, i.e. a language problem. When using Solr6 with Alfresco, by default, the cross-locale configuration is commented out, which means it is effectively disabled. When someone or something uploads a document to Alfresco, it is assigned a "sys:locale" property: the language used by the client that created the document. If it's a web browser, it would be the language configured in your browser. If it's a command-line request, it would be the OS session language, etc… Then, when users try to search for that document, the locale of the user is also assigned to the search request, and if it doesn't match the initial locale (the one used to create the document), you will probably never be able to find anything.

Unfortunately, Solr6 for Alfresco assumes that everybody speaks and uses only one language, which is definitely not true for a lot of large-scale or international enterprises, or even for smaller ones in countries with multiple official languages, like Switzerland.

Environment Preparation

In the environments that I install, I always configure the cross-locale support. So, first of all, I will need to revert to the default Solr6 configuration. More details on that will be in the solution section below. With a default Solr6 installation/index, I will create a custom type with a single property that I will use to perform the searches (it’s not mandatory, you can use the default type/properties too). You can also use Full-Text Search, but it’s probably easier for me to showcase the problem using a very specific property.

Therefore, I created a custom type (dbi:demo_doc) and a property (dbi:demo_prop) using the model manager. “dbi:demo_prop” is a simple text property that is mandatory, indexed as “Free Text” (c.f. the documentation here for more details) and I didn’t set any constraints for this example but it could have had one. I will just use values such as “NB-[0-9]{4}” (so NB-0000, NB-0001, etc.):

Showcase – Web Browser in English

Now that I have a cleaned and out-of-the-box Solr6 index and my property to play with, let’s start with creating a first test document in a first browser with an English display. Assigning this document to my custom type “dbi:demo_doc” and using the value “NB-0001” for my property “dbi:demo_prop“:

As you can see above, the “sys:locale” has been set to “en_US” (coming from my browser settings) and please also note the DBID 1519 for later.

Then on that same browser, the document is properly searchable, there is 1 result, so far so good:

Note: I’m specifically not adding an “=” at the beginning of the search term so that it goes to Solr. If you have TMDQ (Transactional MetaData Query) enabled and if you use an exact match for a property, then it will normally use the database to answer the search, which would defeat the purpose of this blog…

Showcase – Web Browser in French

Now let’s open a second browser, configured with a French display this time, and do the exact same thing with a new document:

As you can see above, the “sys:locale” has been set to “fr” this time, for the same kind of document, simply because it was created from another browser with a different display language (which you can confirm on the above screenshots, as everything is written in French). As before, please also note the DBID 1523 for later.

Then, on that same browser, the document itself is also searchable, but we would normally have expected 2 results, no…? There is only one here:

This is the issue I’m talking about. Because cross-locale support is disabled, a document imported with a different locale isn’t found when you search for some text in its content or properties.

It’s pretty obvious with the above already, but if you really want to confirm that the documents are indexed inside Solr, you can run something like “q=dbi:demo_prop:’NB-0001′” on the Solr UI, and it will give you 2 results: 2 documents with DBID 1519 and 1523. If you remember, or if you look at the above screenshots, you can see that these are the values of the “sys:node-dbid” property for the 2 documents. Therefore, they are indexed in Solr:
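The same check can be done from the command line against the Solr select handler. Host, port (8983) and core name (alfresco) below are assumptions for a default Search Services setup, so the curl call is commented out and the snippet only prints the request it would send:

```shell
# Hypothetical direct Solr query mirroring the Solr UI search above.
# Host, port and core name are assumptions for a default install.
Q="dbi:demo_prop:'NB-0001'"
SELECT_URL="http://localhost:8983/solr/alfresco/select"
echo "GET ${SELECT_URL}?q=${Q}&fl=DBID&wt=json"
# curl -s --get "${SELECT_URL}" --data-urlencode "q=${Q}" \
#   --data-urlencode "fl=DBID" --data-urlencode "wt=json"
```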

How to Solve

So now, how do we fix that? Fortunately, it’s not very complex, but depending on your repository size, it might take some time, as it will require enabling cross-locale support and then performing a complete reindex… Since I’m using a nearly empty development environment, it will be very fast for me.

On the Solr host, let’s start by checking the cross-locale configuration:

[solr@solr01 ~]$ cd $SOLR_HOME/solrhome/conf/
[solr@solr01 conf]$
[solr@solr01 conf]$ grep "cross.locale.datatype" shared.properties
#alfresco.cross.locale.datatype.0={http://www.alfresco.org/model/dictionary/1.0}text
#alfresco.cross.locale.datatype.1={http://www.alfresco.org/model/dictionary/1.0}content
#alfresco.cross.locale.datatype.2={http://www.alfresco.org/model/dictionary/1.0}mltext
[solr@solr01 conf]$

As you can see, the 3 lines for cross-locale configuration of datatypes are commented out by default (by the way, it’s also indicated in the documentation here). Therefore, let’s simply uncomment these and then we will scratch the index so that Solr6 can perform a full reindex:

[solr@solr01 conf]$ sed -i 's,^#\(alfresco.cross.locale.datatype\),\1,' shared.properties
[solr@solr01 conf]$
[solr@solr01 conf]$ grep "cross.locale.datatype" shared.properties
alfresco.cross.locale.datatype.0={http://www.alfresco.org/model/dictionary/1.0}text
alfresco.cross.locale.datatype.1={http://www.alfresco.org/model/dictionary/1.0}content
alfresco.cross.locale.datatype.2={http://www.alfresco.org/model/dictionary/1.0}mltext
[solr@solr01 conf]$
[solr@solr01 conf]$ sudo systemctl stop solr
[solr@solr01 conf]$
[solr@solr01 conf]$ ls -l $SOLR_DATA_HOME/
total 4
drwxr-x---. 4 alfresco alfresco   37 Oct 16 12:53 index
drwxr-x---. 2 alfresco alfresco 4096 Oct 16 12:54 models
[solr@solr01 conf]$
[solr@solr01 conf]$ rm -rf $SOLR_DATA_HOME/*
[solr@solr01 conf]$
[solr@solr01 conf]$ sudo systemctl start solr
[solr@solr01 conf]$
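On a real repository, the full reindex can take a while. One way to follow the progress is the Solr admin SUMMARY report, which exposes per-core counters such as the remaining transactions. Host and port below are assumptions for a default Search Services install, so the curl call is commented out:

```shell
# Hypothetical progress check of the reindex via the Solr admin API.
# Host and port are assumptions; the SUMMARY report shows remaining work per core.
SUMMARY_URL="http://localhost:8983/solr/admin/cores?action=SUMMARY&wt=json"
echo "GET ${SUMMARY_URL}"
# curl -s "${SUMMARY_URL}" | grep -i "remaining"
```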

Once the reindex is complete and the documents are indexed again, you can go back to the English or French browser and execute the previous search, and voilà, 2 results:

You can now create new documents with any locale, and it should be transparent from a search perspective. If you would like to see which locale is sent during search requests, you can enable DEBUG logs for “org.alfresco.repo.search.impl.solr.SolrQueryHTTPClient” and something like this should appear:

2025-10-16 15:35:29,732  DEBUG [impl.solr.SolrQueryHTTPClient] [http-apr-8080-exec-39]    with: {"tenants":[""],"locales":["en_US"],"defaultNamespace":"http://www.alfresco.org/model/content/1.0","textAttributes":[],"defaultFTSOperator":"AND","defaultFTSFieldOperator":"AND","anyDenyDenies":true,"query":"dbi:demo_prop:'NB-0001' ","templates":[{"template":"%(cm:name cm:title cm:description ia:whatEvent ia:descriptionEvent lnk:title lnk:description TEXT TAG)","name":"keywords"}],"allAttributes":[],"queryConsistency":"DEFAULT","authorities":["GROUP_EVERYONE","ROLE_ADMINISTRATOR","ROLE_AUTHENTICATED","admin"]}
2025-10-16 15:35:29,732  DEBUG [impl.solr.SolrQueryHTTPClient] [http-apr-8080-exec-39] Got: 2 in 47 ms
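In a classic log4j-based setup, enabling that logger could look like the line below; the exact file name and location depend on your Alfresco version (log4j.properties here, or the log4j2 equivalent on recent versions), so treat it as a sketch:

```properties
# Hypothetical log4j.properties entry to trace the locales sent to Solr
log4j.logger.org.alfresco.repo.search.impl.solr.SolrQueryHTTPClient=debug
```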

Happy searching!

The post Alfresco – Documents not searchable when uploaded via REST-API? appeared first on the dbi Blog.

Reflections from Oracle AI World 2025: Innovation, Community, and the Road Ahead

DBASolved - Thu, 2025-10-16 13:15

Discover the major announcements from Oracle AI World 2025, including the transformation to Oracle Database 26ai, GoldenGate's AI-ready capabilities, and the game-changing OCI GoldenGate on Azure integration. Learn how Oracle is architecting AI into the core of data management and why these innovations position organizations for the AI revolution.

The post Reflections from Oracle AI World 2025: Innovation, Community, and the Road Ahead appeared first on DBASolved.

Categories: DBA Blogs
