
Andrejus Baranovski

Blog about Oracle technology

ADF Query Design Revisited with ADF 12c Panel Drawer

Sat, 2014-04-12 12:22
My goal in this post is to take a closer look at ADF 12c and check how the ADF Query layout can be optimised using the Panel Drawer component. The sample application focuses on several items:

1. Panel Drawer component in ADF 12c and its usage for ADF Query

2. Declarative mode support for VO Query optimisation

3. Dynamic bindings in Page Definition for ADF Query

4. View Object query logging

Here you can download the sample application - this is how the UI looks initially, when the Employees fragment is opened:

You should notice the magnifying glass icon at the top right; this is rendered by the ADF 12c Panel Drawer component. The user can click this icon and the ADF Query will slide in and become available; click the same magnifying glass, or anywhere else on the page, and it slides out automatically. This is really convenient, as it saves space on the screen - the ADF Query is rendered on top of other components in a drawer:

Obviously, you can have as many panel drawers as you prefer - this is not only for ADF Query. As you can see in the screenshot above, a search is executed. There are only three columns in the results table - the SQL query is generated accordingly, with the key and the attributes for these three columns. This is a feature provided by VO declarative mode:

Move to the second accordion group and search for a different value - this group displays more columns:

The SQL query is different as well - it now includes all visible attributes. Such a search implementation is especially useful for cases where many attributes must be displayed in the result set. It makes sense to first display a result set with only a few attributes and give the user an option to see all attributes in an additional table. This means the initial SQL query is lighter, and all attributes are fetched later, as in this example:

Let's check the technical details now. The VO is set to run in declarative mode and all attributes (except the primary key) are marked with Selected in Query = false. This allows the displayed attributes to be calculated at runtime, based on ADF bindings, and the SQL query to be constructed accordingly:

There is one hidden gem in the sample application - a generic class providing detailed logging for executed SQL queries and fetched rows. You could use it in your project immediately, without any changes:

ADF Query is integrated into the Panel Drawer using the regular ADF Faces Show Detail Item component, with a custom magnifying glass image set:

Each accordion item where results are displayed is set with a disclosure listener - this is needed to update the current context and apply ADF Query to filter results in either the Preview or Complete accordion item:

Each accordion item is configured with a JSF attribute to identify itself:

The accordion item disclosure listener updates the currently opened item index; this is used in ADF bindings to dynamically map the ADF Query binding to the proper results iterator:
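As a rough sketch of this wiring - the bean, attribute and item names below are illustrative, not the ones from the sample - an accordion item carrying a JSF attribute and a disclosure listener could look like this:

```xml
<af:showDetailItem text="Preview" id="sdi1"
                   disclosureListener="#{viewScope.searchBean.itemDisclosure}">
  <!-- JSF attribute identifying this accordion item to the listener -->
  <f:attribute name="itemIndex" value="0"/>
  <!-- results table for the Preview item goes here -->
</af:showDetailItem>
```

The listener reads the itemIndex attribute from the event's component and stores it, so the bindings layer knows which results iterator is currently active.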

ADF Query is represented in ADF bindings by a Search Region. Unfortunately, the Search Region Binds property doesn't work with a dynamic expression. I resolved this with one trick - I defined a new iterator whose Binds property is calculated dynamically (using the current accordion item index). The Search Region points to this artificial iterator, and the iterator in turn points to the VO instance. Both results tables point to the same VO instances:
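The Page Definition side of this trick could look roughly like the sketch below - the iterator, VO instance and bean names are illustrative, only the shape of the expression matters:

```xml
<!-- Artificial iterator: Binds is computed from the opened accordion item -->
<iterator id="DynamicSearchIterator" DataControl="AppModuleDataControl"
          RangeSize="25"
          Binds="${viewScope.searchBean.itemIndex == 0 ? 'EmployeesPreview' : 'EmployeesComplete'}"/>
<!-- The Search Region points at the artificial iterator -->
<searchRegion Binds="DynamicSearchIterator" Criteria="EmployeesViewCriteria"
              Customizer="oracle.jbo.uicli.binding.JUSearchBindingCustomizer"
              id="EmployeesViewCriteriaQuery"/>
```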

You can spot attribute definitions in the table bindings - VO declarative mode decides which attributes to include in the SQL query, based on these attribute bindings.

MDS Seeded Customization Approach with Empty External Project

Sat, 2014-04-05 11:27
MDS Seeded customisation support is a great ADF feature. It is particularly useful for independent software vendors who develop their own products on top of the ADF framework; with MDS Seeded customisations, maintenance for multiple customers becomes easier. We can create a customisation instead of changing the original source code, which makes the product easier to maintain. I would like to share one important hint related to the technical architecture for MDS Seeded customisations - the way the customisation files are organised and maintained. By default, you would create MDS Seeded customisation files in the original application, but this is not a very clean approach. There is a way to create and keep all MDS Seeded customisation files in an empty external application. In this post I will describe how this can be achieved in a few easy steps.

If you are going to test the sample application - with integrated WLS instances in JDeveloper, make sure to read this post (it describes how to set up a local MDS repository in the file system) - How To Setup MDS Repository for Embedded WLS Instance.

Let's start - you can download the initial version of the sample ADF application from the blog post mentioned above. It contains an Employees form and an empty Submit button:

I don't want to create any MDS Seeded customisation files inside it; instead I will build and deploy an ADF library out of the main application:

The sample comes with a special application - CustomizationApp (you can find it in the archive). This application was created solely to keep MDS Seeded customisation files. Initially, an empty project was created, into which the ADF library (the one we have just deployed) was imported:

The empty project is enabled with MDS Seeded customisation support:
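For completeness - at runtime, an application consuming seeded customisations typically declares a customization class in adf-config.xml. A minimal sketch using one of the standard site-level classes (your project may use a different layer and class):

```xml
<adf-mds-config xmlns="http://xmlns.oracle.com/adf/mds/config">
  <mds-config xmlns="http://xmlns.oracle.com/mds/config">
    <cust-config>
      <match path="/">
        <!-- Standard site-level customization layer -->
        <customization-class name="oracle.adf.share.config.SiteCC"/>
      </match>
    </cust-config>
  </mds-config>
</adf-mds-config>
```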

Restart JDeveloper in customisation mode, so we can create customisations for the content from the imported ADF library:

If MDS Seeded customisation mode was applied successfully, you should see a special icon next to the application name. Choose 'Show Libraries' to see a list of libraries, so we can see the contents of the imported ADF library:

All attached libraries will be displayed; locate our ADF library. Expand it and you should see the application packaging:

We can apply several customisations now. Let's open the Employees VO and define a View Criteria (filter employees by salary >= 1000). This customisation will be stored inside our empty project:

There will be a change in the AM - the View Criteria will be applied to the VO instance:

We can review the XML files for the applied MDS Seeded customisations. There is one for the VO and one for the AM. The AM XML contains delta information about the View Criteria applied to the VO instance:

One more customisation, on the UI level now - drag and drop the Commit operation onto the Submit button:

This change creates two additional XML files with MDS Seeded customisations - one for the JSF fragment and one for the Page Definition:

You must define an MDS Seeded customisations deployment profile (MAR) for the application with the empty project (containing the XML files):

Development is completed; the last bit is deployment. Make sure WLS is started (start it separately if you want to test MAR profile deployment):

Go ahead and deploy the main application first - you should get a list of MDS repositories (see the hint on defining a local test repository in the blog post mentioned above):

Once the main application is deployed, you can apply MDS Seeded customisations and export them through a MAR file from the external application:

You should see the main application name in the wizard; MDS Seeded customisations will be applied to this application:

All changes applied through MDS Seeded customisations will be visible in the log:

A good point - there is no need to restart the main application after MDS Seeded customisation changes are applied. Here you can see the original application with the changes described above:

If applied changes need to be removed, they can simply be removed from the MDS Seeded customisation XML file and re-applied. The main advantage of this approach - there is no need to store XML files with MDS Seeded customisations inside the original project; we can keep them outside.

Hide All Search Operators for ADF View Criteria Item

Fri, 2014-04-04 14:51
During this week's work, I received an interesting question from an ADF developer - how to hide all operators for a specific ADF View Criteria item. There is an option to hide the different operators one by one, following the API guide for JboCompOper. You could follow this approach, but there is a smarter way - to hide all operators at once. See below how.

The sample application is available for download - this application implements a View Criteria with several items to search on (FirstName, LastName and Salary):

Search operators are configured to be displayed in both Basic and Advanced modes:

This is how it looks on the UI - ADF Query with the standard list of search operators:

I would like to hide all search operators for the FirstName and LastName attributes. This is pretty easy - we need to open the VO attribute and define a CompOper tag. If you want to hide all search operators, simply provide "*" for the Name and Oper properties and set "-2" for the ToDo property (this is OPER_TODO_CLEAR, meaning - remove all operators). Name and Oper are required properties, but we can provide the "*" sign, indicating the rule applies to all operators. Repeat the same for both the FirstName and LastName attributes:
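In the VO XML source, the resulting definition looks roughly like the sketch below - the surrounding ViewAttribute properties are illustrative, but the CompOper values are exactly the ones described above:

```xml
<ViewAttribute Name="FirstName" EntityAttrName="FirstName" EntityUsage="Employees">
  <!-- Name="*" and Oper="*" match every operator;
       ToDo="-2" is OPER_TODO_CLEAR - remove all matched operators -->
  <CompOper Name="*" Oper="*" ToDo="-2"/>
</ViewAttribute>
```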

Search operators are now hidden for the FirstName and LastName items in both modes - Basic:

And Advanced mode:

Sounds pretty easy, doesn't it? Just remember the trick with the "*" symbol.

ADF 11g PS6 Table Pagination and Displaying Selected Row Issue - Solution

Fri, 2014-03-28 15:33
A while ago, I had a blog post about a new feature in ADF 11g PS6 - table pagination support. There is an issue when we want to open a specific row and display it automatically in the table - the required table page for the selected row is not opened correctly. However, a blog reader suggested a fix received from Oracle Support, and was kind enough to post it in a comment - you can read it here: JDev/ADF sample - ADF 11g PS6 Table Pagination and Displaying Selected Row Issue. I decided to test this fix myself and provide an updated sample application. The fix is to take the range start from the iterator and set it as the first property of the table with pagination. This fix does the job, but not completely perfectly - the current row is displayed only if Range Size for the iterator is set to 25; probably there is some hard coding somewhere. Ok, but at least it works.

Download the sample application - this application contains two fragments; the second fragment, with the table, is opened from the first one, where the current row is selected. In the first fragment, we call setPropertyListener on the navigation button and save the required information in pageFlowScope (to be used in the second fragment):
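A sketch of such a listener on the navigation button - the iterator name, action outcome and scope variable are illustrative, not taken from the sample:

```xml
<af:commandButton text="Open Table" action="goTable">
  <!-- Save the iterator's range start before navigating to the table fragment -->
  <af:setPropertyListener type="action"
                          from="#{bindings.EmployeesIterator.rangeStart}"
                          to="#{pageFlowScope.rangeStart}"/>
</af:commandButton>
```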

This information is the range start - once we move the current row in the row set, the range start changes as well. We are going to use the range start to set the first property of the table; in this way, we can force the table to display the page with the selected row:

Here you can see how the table first property is set - we use the range start saved in pageFlowScope in the first fragment. This forces the ADF table with pagination to display the required page of rows:
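The arithmetic behind the fix is simple; a self-contained sketch (the helper class is mine, not from the sample): the range start reported by the iterator is the selected row index rounded down to a multiple of the range size, and that value is what gets pushed into the table first property.

```java
// Illustrative helper (not from the sample app): computes the range start
// that the iterator reports for a selected row, i.e. the value to assign
// to the table "first" property so the right page is displayed.
public class PaginationHelper {

    // Row index rounded down to a multiple of the range size.
    public static int rangeStartFor(int rowIndex, int rangeSize) {
        return (rowIndex / rangeSize) * rangeSize;
    }

    public static void main(String[] args) {
        // Row 30 with a range size of 25 falls on the second page,
        // which starts at row 25.
        System.out.println(rangeStartFor(30, 25));
    }
}
```

With Range Size 25, any row index from 0 to 24 yields range start 0 (first page), 25 to 49 yields 25 (second page), and so on.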

Let's see how this works. Select a row belonging to the first range (I have configured Range Size to be 25):

The table with pagination is loaded and the selected row is displayed on the first page - this is correct:

Navigate to some row from the second range (after the first 25 rows):

The selected row is displayed on the second page, as expected. There is one small issue - the selected row is displayed at the bottom, while it should be somewhere in the middle. Well, this is another issue related to ADF table pagination:

If you navigate back to the first page and then again to the second page, the selected row will be displayed correctly in the middle:

In my opinion, ADF table pagination is not yet a well-tested and stable feature. Perhaps we should wait for improvements in the next release before using it in complex scenarios.

Red Samurai Performance Audit Tool v 3.0 - Getting Smarter

Thu, 2014-03-27 11:47
Our ADF Performance Audit tool is growing and getting smarter. The current release, v 3.0, focuses on the effectiveness of collected audit data reporting. Many features have been added since the early release in 2012 - Red Samurai Performance Audit Tool - Runtime Diagnosis for ADF Applications. You can check the features added in the 2.8 release - Red Samurai Performance Audit Tool v 2.8 - Activation Focus.

Why is the v 3.0 release smarter? Because it can tell ADF application health. We collect and analyse multiple ADF application performance metrics and calculate an LED status - red, yellow or green - to indicate ADF application health (see the top right corner):

The application health indication can be drilled down into, to see more detailed metrics about ADF application performance:

My favourite new feature - the performance analysis monitor. We display the full history of performance issues, logins, queries and transactions. It is very easy to see and compare ADF application performance over time:

We not only report data, but also analyse it. We apply a special algorithm to smooth the data and display trends in ADF application usage and the number of issues:

This helps to understand whether ADF performance issues are the result of a higher load on the system, or of untuned ADF BC.

We also display the total distribution of activations per AM - this helps to identify AM's under heavy activation and tune the ADF BC configuration:

ADF Alert - Facelets Vulnerability in ADF 11g R2 and 12c

Wed, 2014-03-26 06:18
If you are running your application in an ADF 11g R2 or 12c environment and using facelets, you should double check that the source code for the facelet pages is not accessible through the URL. There is another security vulnerability in ADF 11g R2, documented here - Alert for ADF Security - JSF 2.0 Vulnerability in ADF 11g R2. There is a patch from Oracle for the JSF 2.0 vulnerability, and also a manual fix. However, neither the patch nor the manual fix is applied by default, so your source code could potentially be exposed to public access. This is why I post it on the blog - for all ADF users to be aware.

I don't have a solution for the vulnerability described in this post; you should contact Oracle Support and ask for a patch. Reproducing this vulnerability is pretty easy - remove "faces" from the URL and try to access your page (for example main.jsf); the source code for the page will be loaded.

The sample application - was tested with the ADF 11g R2 and 12c runtimes.

It doesn't help to set the .jsf extension name in the web.xml context parameter, as it does for the ADF security vulnerability described in the previous post:

When we reference an ADF web page with "faces" in the context root, the page content is rendered as expected:

However, if you remove "faces" from the context root and try to access main.jsf - instead of returning an error, the ADF 11g R2 runtime will serve the main.jsf page source code (a bit unexpected, right?):

The same happens with the ADF 12c runtime:

Update from Oracle Support: a patch for CVE-2013-3827 is available for this issue in the October 2013 CPU.

Alert for ADF Security - JSF 2.0 Vulnerability in ADF 11g R2

Sun, 2014-03-23 05:06
You must be concerned about your system security if you are running an ADF runtime based on the affected ADF versions. These versions use JSF 2.0, with a known security vulnerability - Two Path Traversal Defects in Oracle's JSF2 Implementation. This vulnerability allows downloading the full content of WEB-INF through any browser URL. There is a fix, but it is not applied automatically by the JDeveloper IDE when creating a new ADF application. To prevent WEB-INF content download, you must set the javax.faces.RESOURCE_EXCLUDES parameter in web.xml - make sure to provide all file extensions you want to prevent from being accessible through the URL.

By default, when the vulnerability fix is not applied, we can access WEB-INF content using a path similar to: http://host:port/appname/faces/javax.faces.resource.../WEB-INF/web.xml. Unless you want to allow your users to download the source code, make sure to apply the fix in web.xml manually:
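The parameter itself is standard JSF; a web.xml entry along these lines blocks direct resource access for the listed extensions (the exact list beyond the JSF defaults is up to your application - treat the extra extensions here as an example):

```xml
<context-param>
  <param-name>javax.faces.RESOURCE_EXCLUDES</param-name>
  <!-- Space-separated list of extensions that must not be served
       as static resources through javax.faces.resource -->
  <param-value>.class .jsp .jspx .properties .xhtml .groovy .jsf .xml .jar</param-value>
</context-param>
```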

My test case - (this sample comes with the vulnerability fix disabled - the default state) is implemented with JDeveloper; I will demonstrate how to reproduce the JSF 2.0 vulnerability with this version:

The test case consists of two basic applications, one of them packaged as an ADF library:

The ADF library is imported into the main sample application:

To reproduce the vulnerability is pretty easy - run the main sample application and log in with the redsam/welcome1 user:

The default URL is generated in the browser and the first page is rendered - all good so far:

1. web.xml vulnerability

Remove the main page name and after faces/ type javax.faces.resource.../WEB-INF/web.xml - you will access the web.xml content:

2. weblogic.xml vulnerability

Access the content with javax.faces.resource.../WEB-INF/weblogic.xml:

3. Local ADF Task Flows vulnerability

Access Task Flows in WEB-INF, using the Task Flow path and name:

4. ADF Library JAR vulnerability

All ADF Library JARs are packaged into the WEB-INF folder by default; this means we could download these JARs and get the entire code. You only need to type the JAR file name. It is possible to get JAR file names from the ADF BC configuration file, for JARs imported into the ADF Model:

5. adfm.xml configuration file vulnerability

Here we can get a list of DataBindings files:

6. DataBindings.cpx file vulnerability

We have a list of DataBindings files from the previous step. Now we can open each DataBindings file and get a list of pages/fragments together with Page Definition mappings. We can also read path information for ADF BC:

7. ADF BC vulnerability

Based on the ADF BC path information from the previous step, we can access the Model.jpx file and read information about ADF BC packages:

8. ADF BC configuration vulnerability

We can go further down and download every ADF BC component - EO/VO/AM. From bc4j.xcfg we can read information about each AM configuration, the data source name, etc.:

Shortcut to Call Custom View Row Method from JSF Expression

Thu, 2014-03-20 00:10
There is a custom method in a generic View Row Implementation class and you need to invoke it from the UI. What would you do? Most likely you would generate a Java View Row Implementation class for the specific VO, publish the custom method through the interface and later consume it through ADF bindings. This works, but there is a shortcut - one that works especially well for generic solutions.

The sample application - implements a table with the row status displayed in each row. Once the user changes data, the row status is updated - this is the use case: display the row status generically for each row:

We can see from the log that the row status is evaluated for each row:

This check is implemented without a method binding in the Page Definition. Instead, we access the current row (available in the table context) and from it get the row object (the one that represents the actual VO row). Then, from the row object, we invoke our custom method, implemented in the generic View Row Implementation class - row.row.checkRowStatus:

There is a generic View Row Implementation class; the View Object in the sample app extends this class:

The generic View Row Implementation class contains our custom method, invoked from the ADF UI - getCheckRowStatus:

Pay attention: in the JSF expression, the get part is omitted from the method name - row.row.checkRowStatus.
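The naming rule here is just the JavaBean convention that EL follows everywhere; a plain-Java illustration (the class and return values are made up, only the get-prefix rule matters):

```java
// Plain-Java illustration of the JavaBean convention behind the EL:
// the expression "row.row.checkRowStatus" resolves to getCheckRowStatus().
public class RowStatusExample {

    private boolean modified = true;

    // EL property name: checkRowStatus (the "get" prefix is dropped,
    // the first letter is lower-cased)
    public String getCheckRowStatus() {
        return modified ? "Changed" : "Unchanged";
    }

    public static void main(String[] args) {
        System.out.println(new RowStatusExample().getCheckRowStatus());
    }
}
```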

I believe this is a simple, but very effective technique.

How To Setup MDS Repository for Embedded WLS Instance

Thu, 2014-03-13 14:58
Recently I was enabling an external MDS repository for ADF MDS Seeded Customizations and was facing issues while testing such an MDS repository with my local embedded WLS instance, running directly from JDeveloper. I managed to find a solution in the end, so I would like to share it with you.

I was doing the same thing as when configuring MDS support for ADF Query Saved Search - defining a persistence config in the adf-config.xml file. This config allows you to map your MDS-enabled ADF application to an MDS repository during deployment:

With the persistence config present in adf-config.xml, deploy the sample application - to the embedded WLS server instance:

You will get the MDS repository configuration wizard screen, where you can choose the MDS repository name and type, and provide a partition name. However, by default no MDS repository is present on the embedded WLS, so we can't really select one:

Good news - defining an MDS repository for the embedded WLS instance is quite easy. Of course, this MDS repository is supposed to be used during development only, not in production. In production you should use a DB-based MDS repository configured on a standalone server. Locally, we can create a pretty basic file-based MDS repository. This can be done in the Persistent Stores section, under Services - choose to create a new file-based repository. The only things you need to set are the name and directory. See my example below:

Once you press OK, the MDS repository is created in the file system:

Try to deploy the sample ADF application again - you should see the MDS repository name in the list. This is a file-based repository, and you can define the MDS partition name. All MDS documents for the current application will be stored under this partition:

Why You Don't Want to Code Validation in Before Commit

Thu, 2014-03-06 13:37
You should know by now - many things are possible in ADF, but that doesn't mean every solution is right, even if it works. One example of such a case is coding validation rules in the beforeCommit method. This method is invoked after all changes are posted, and ADF BC assumes the data is valid; if we later throw a validation error from beforeCommit, the ADF BC state remains unchanged and the changed data is not submitted again. There is a workaround - set jbo.txn.handleafterpostexc=true to force an in-memory passivation snapshot with subsequent activation on validation error - however, this is a big performance hit. Every time there is a validation error, a rollback is executed and the entire AM with all VO instances is re-activated (SQL re-executed and data re-fetched). Today's post is about this bad practice - to demonstrate why you should not code validation in the beforeCommit method.

The sample application - implements a validation rule in the EO beforeCommit method. The validation rule calls a PL/SQL function and throws an exception if the validation result is false. It validates the salary value from the current row (the PL/SQL function code is attached in the sample application):

The PL/SQL function checks if the salary value is lower than 2000:

The sample application is configured with DB pooling. By default, without DB pooling set, the DB lock will not be removed after an exception from beforeCommit:

To test the beforeCommit validation behaviour, open the form and set a Salary value below 2000. Change another field as well - FirstName, for example:

As expected, there is a validation error and an exception thrown - the error message is displayed:

We can see the sequence of steps in the log. There is a DB lock for the current row, data is posted successfully for both the Salary and FirstName attributes, then the exception happens:

Ok, we can fix the Salary attribute value and set it higher than 2000. Keep the same value for the FirstName attribute; it is not yet committed to the DB and we expect it to be committed now:

What we see in the log now - the Salary attribute value was posted and committed, as validation passed successfully. However, the second changed attribute - FirstName - was not committed. This is because ADF BC thinks FirstName was committed during the previous commit, when it actually failed. As the exception was thrown in beforeCommit, this happens after ADF BC marks the data as valid and assumes it has been committed to the DB. This is the main issue with coding validation in beforeCommit - the ADF BC transaction state remains invalid:

If you press Rollback, you will see FirstName reset to its previous value:

There is a workaround for this case - the jbo.txn.handleafterpostexc=true property in the AM configuration:

However, this leads to runtime performance issues, as it fires in-memory passivation on each commit and does a rollback with re-activation on every validation error in beforeCommit. We can track all such activations fired from beforeCommit by overriding the activateStateForUndo method in the AM:

We can test the same scenario - change the LastName and Salary attributes, and set Salary to less than 2000:

Now we can spot that an activation event was raised:

During activation, all VO's are re-executed and re-fetched. Here we can see the SQL query and data fetch for Employees:

There is a SQL query and data fetch for Departments, even though we didn't touch it:

Improving ADF UI Table CRUD Functionality with Auto Focus

Sun, 2014-03-02 09:32
Improving and tuning ADF application performance doesn't mean only tuning technical ADF framework parameters. Performance tuning can also be applied to application UI behaviour. I will use an ADF Faces table example in this post: by default, when a new row is inserted, focus on the new row's column is not set, and the user needs one extra click to set focus and start entering data. There is a way to eliminate this extra click and set focus automatically on a column in the new ADF table row. I will describe a generic method to achieve such functionality.

Let's see first how it works. We have a regular ADF Faces table:

Press the Create button to insert a new row:

By default, when a new row is inserted, focus in the new row's column is not set - the user needs to do one extra click to set focus himself. This is improved in my sample application - focus in the new row's column is set automatically after the Create button is pressed (it is configured to set focus on the first column):

It is important to understand that, when setting focus on ADF UI table row columns, each row in HTML is assigned a certain ID. This ID is appended to the original inputText component ID - basically, the same inputText component is created as many times as there are rows in the table. Here you can see the new row was assigned index 25; this index is not related to the row number:

Focus ID - the inputText component ID in the first column of the new row. It is composed of the region, panelCollection, table, row ID and inputText component IDs.

I create one more row:

Focus is set automatically:

This time, the row ID is different - I will describe below how such an ID can be retrieved in a generic method:

From the implementation point of view, there is a Create button with JSF attributes. One of the attributes sets the ID of the inputText to focus, the second provides the action name to be executed. Focus is set through a JavaScript function:

A few changes are needed for the ADF UI table properties. Make sure to set contentDelivery="immediate" and displayRow="selected"; this is needed to ensure focus is set correctly every time, even when the user scrolls through the table:

The sample application implements a generic action listener; it reads the method to execute from a component attribute, as well as the UI component ID to apply auto focus to. We locate the table binding automatically, get the Client Row Key (this is the row ID I mentioned in the example above) and construct the final ID to use for the focus. As you can see, the method action is executed first, and the auto focus code afterwards (we need the new row to be inserted first):
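The ID construction itself is plain string assembly; an illustrative sketch (the segment values are made up - the real ones come from your page structure and the table's Client Row Key):

```java
// Illustrative only: the absolute client id is the region, panelCollection,
// table, client row key and inputText ids joined with ':'.
public class FocusIdBuilder {

    public static String focusId(String region, String panelCollection,
                                 String table, String rowKey, String inputText) {
        return String.join(":", region, panelCollection, table, rowKey, inputText);
    }

    public static void main(String[] args) {
        // e.g. row key "25" assigned to the newly inserted row
        System.out.println(focusId("r1", "pc1", "t1", "25", "it1"));
    }
}
```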

At last, the JavaScript method is invoked from Java, to set focus on the row identified by the row ID number.

Red Samurai Performance Audit Tool v 2.8 - Activation Focus

Tue, 2014-02-25 13:18
Here we come again - a new minor update for the Red Samurai Performance Audit Tool has been prepared in our labs. Update 2.8 focuses on more accurate ADF BC activation time tracking. As you may already know from my previous blog post - ADF BC Performance - View Object Instance Lazy Activation - ADF BC may apply deferred activation for a VO instance. If an instance is not used in the current Page Definition, it will be activated later, when the user opens a page or fragment with that VO instance. For more technical details and a sample application, please read the blog post mentioned above. We keep this behaviour in mind when logging slow activations; this allows us to understand the real impact of slow activations and display it in the audit reports.

The drill-down graph for slow activation analysis is updated with search functionality; you can browse through activations using different search criteria (such as time, date, VO's involved, etc.):

Here I'm filtering all slow activations reported by date. I can see there are many slow activations with times close to the 0 - 30 second range. I can restrict my search criteria by filtering activation time in the interval of 0 - 30 seconds:

It is much easier now to analyse and drill down into group of slow activations.

The size of the point reflects the number of VO instances involved in the activation group. 33 VO instances are involved in this specific case; other details such as time, records fetched, etc. are displayed as well:

We have planned several exciting updates for the future. Read here about features included in the previous update - Update for Red Samurai Performance Audit Tool - v 2.4.

ADF BC Performance - View Object Instance Lazy Activation

Thu, 2014-02-20 01:11
ADF BC is a great framework for managing data, and one thing it does really smartly is View Object instance lazy activation. You must know by now - passivation/activation events in ADF BC are relatively expensive in terms of performance. Especially expensive is the activation event, as during this event the View Object instance is re-constructed and data is re-fetched. However, it is not as bad as it sounds - ADF BC doesn't activate all View Object instances from the Application Module together. Only View Object instances referenced from the current Page Definition are activated. This means that if a user was working with 10 screens before the passivation event happened, during the activation event only View Object instances from the current screen will be activated. View Object instances from other screens will be activated when the actual screen is accessed. Good news - you don't need to tune anything for this; this is how it works by default.

We track ADF BC performance with the Red Samurai Performance Audit Tool - Update for Red Samurai Performance Audit Tool - v 2.4. Slow activations are recorded as well; this allows measuring the activation event's impact on your ADF application performance.

To prove the lazy View Object instance activation scenario, I have developed a sample application - The Application Module is set to support only one referenced instance; this means we can force passivation/activation events with two web sessions running in parallel:

The AM data model contains two different View Object instances; this is to check whether both will be activated at the same time:

A generic VO implementation class overrides the activateCurrentRow method, to record the activation event for the View Object instance. The Red Samurai Performance Audit Tool overrides the same method to log slow activations, as this method is called at the end of the activation event - when all records for the VO rowset have been fetched:

On the Task Flow side, I have created two fragments (one for each VO instance from the AM data model):

Each fragment is mapped to its own Page Definition file; this means each fragment has its own bindings:

To test the provided sample application, make sure to set the FINEST logging level for the generic VO implementation class:

Open the application; the first fragment, with Employees data, will be loaded:

Go to the Departments fragment to load the Departments VO instance, and navigate back to Employees:

Open a second browser session, without closing the first one:

Referenced Pool Size is set to 1; this means the AM instance from the first session will be passivated. Now go back to the first session's browser and change the row selection - you will see in the log an activation event for the Employees VO instance only:

The Departments VO instance was not activated yet; as it belongs to a different Page Definition, it will be activated lazily - when data from the second Page Definition is loaded. Navigate to the Departments fragment, mapped with the second Page Definition:

Only when you navigate to Departments and load data from the Departments Page Definition does the framework activate the Departments View Object instance - see the new message printed in the log. This is quite smart behaviour in terms of activation performance:

Simple JQuery Notification Message for ADF UI

Mon, 2014-02-17 09:44
If you would like to use JQuery in ADF and are looking for a simple example, this post is for you. I'm sharing a use case for a JQuery notification message, displayed after a commit operation completes successfully in ADF. Once the commit is completed, the notification message is displayed for 2 seconds and then disappears.

Here you can see how it looks. The user successfully saved changed data - a success notification message slides in nicely from the top. The message is rendered on top of the ADF UI, so ADF UI components aren't pushed down:

In case of failure during commit - for example, if a validation rule fails or a DB constraint prevents the data update - no success notification is displayed, only the standard ADF error message:

The JQuery notification message code is taken from the referenced source; we just updated it for better integration with ADF Faces. You can see the entire code in the sample application -

The JQuery JavaScript library is integrated into the ADF application fragment through the ADF Faces resource component, just like any other JavaScript code:

The notification message is displayed from an ADF managed bean action method. First the ADF commit operation is executed; if it returns an error, no notification is displayed. The code to display the notification message could be generic, as part of your core framework structure:
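
The decision flow in such an action method can be sketched without the JSF/ADF classes. Everything here (class name, method names, the notification script) is illustrative, not taken from the sample application; in a real bean the returned script would be pushed to the browser by the render kit:

```java
public class SaveNotification {

    // Stands in for the result of executing the "Commit" operation binding.
    static class CommitResult {
        final boolean hasErrors;
        CommitResult(boolean hasErrors) { this.hasErrors = hasErrors; }
    }

    static CommitResult executeCommit(boolean failValidation) {
        // a real bean would execute the Commit binding and inspect its errors
        return new CommitResult(failValidation);
    }

    // Returns the notification script to push to the client,
    // or null when the commit failed (only the ADF error message is shown).
    public static String saveAction(boolean failValidation) {
        CommitResult result = executeCommit(failValidation);
        if (result.hasErrors) {
            return null;
        }
        return "showNotification('Data saved successfully');";
    }
}
```

The important point is the early return: the script is emitted only on the success path, so a failed validation or DB constraint never triggers the notification.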

I hope this simple example will help you create even nicer UIs for CRUD applications implemented with ADF.

Collecting Changed Row Keys in ADF BC DoDML Method

Wed, 2014-02-12 12:13
If you have ever wondered how to collect information about all changed rows in a transaction, I will provide an answer in this post. As you perhaps already know, the doDML method from the Entity Implementation class is invoked once per changed row. This means our task is to collect the Primary Keys of all changed rows and later access this collection from another method, to process the collected keys (log them, call PL/SQL, etc.).

The sample application -, overrides the doDML method and collects all changed rows for the Employees EO. In a real use case this should be more generic if you want to apply similar logic to different EO's, but you should follow the same concept as described here. In doDML I collect the Primary Key programmatically, getting the key value and saving it in a collection. This collection is then stored in ADF BC session user data. There is no need to passivate the collection of changed rows' Primary Keys, as we are going to access it in the same request from the afterCommit method on the VO level - there will be no passivation in between:

I'm using a trick to get information about the Entity primary key programmatically. There is a method called getPrimaryKeys in the Entity Definition class, but it is marked as protected, so I can't access it from custom code. However, I can implement my own Entity Definition class and wrap the protected getPrimaryKeys method in a custom public method - in this way I make it available to custom code:

doDML is invoked for every changed row, so we need a method called at the end to access the collected info. Such a method could be afterCommit in the View Object Implementation, which is called after all rows have been processed in doDML. There I access the ADF BC user data variable, where the Primary Keys of all changed rows are stored. In this example, I'm simply printing out the keys for all changed rows:
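
The doDML/afterCommit bookkeeping can be modeled in plain Java, with a list standing in for the ADF BC session user data slot. This is a sketch of the mechanism, not the sample code - all names and the key format are illustrative:

```java
import java.util.ArrayList;
import java.util.List;

public class ChangedKeysCollector {

    // Stands in for the ADF BC session user data slot; no passivation is
    // needed, because afterCommit runs in the same request.
    private final List<String> changedKeys = new ArrayList<>();

    // Invoked once per changed row, like EntityImpl.doDML.
    public void doDML(String primaryKey) {
        changedKeys.add(primaryKey);
    }

    // Invoked once after all rows are posted, like ViewObjectImpl.afterCommit;
    // returns the collected keys and resets the slot for the next transaction.
    public List<String> afterCommit() {
        List<String> keys = new ArrayList<>(changedKeys);
        changedKeys.clear();
        return keys;
    }

    public static List<String> demo() {
        ChangedKeysCollector tx = new ChangedKeysCollector();
        tx.doDML("EmployeeId=100");
        tx.doDML("EmployeeId=101");
        return tx.afterCommit();
    }
}
```

Clearing the collection in afterCommit keeps the slot from leaking keys into the next transaction on the same session.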

Here we can see how it works. Change data in multiple rows and press Save:

The Primary Key name and values for every row containing changes are printed out:

We could use the Primary Key values to access the changed rows, process the attributes we are interested in, and call PL/SQL if required, using attribute values as parameters.

Explaining Change Indicator Property for ADF BC Attribute

Fri, 2014-02-07 13:21
There is one not very visible but quite powerful property available for an ADF BC Entity Object attribute - Change Indicator. By default, during the commit operation, ADF compares each changed attribute from the current row against the value in the DB. If it finds that values were already changed in the DB, it reports an error about another user's changes in the same row. While this is expected functionality, there are use cases where we want to allow the commit even if the data was already changed by someone else. For example, in more complex systems data is updated by PL/SQL procedures, and we don't want to bother the user with an error about this. One option is to override the lock method in the EO implementation class, catch the lock error and lock again. This works, but there is a different way - to use Change Indicator. This property designates a specific attribute to be responsible for tracking row data changes, instead of checking every attribute. Only if the Change Indicator attribute value was changed in the DB will the current row changes be prevented and the user informed about someone else's changes in the same row.
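
The comparison logic behind this property can be sketched in plain Java. This is a simplified model of the behaviour described above, not ADF's actual lock implementation - the method and attribute names are illustrative:

```java
import java.util.Map;

public class ChangeIndicatorCheck {

    // With a Change Indicator set, only that attribute is compared against
    // the DB; otherwise every attribute of the cached row is compared.
    public static boolean rowChangedByAnotherUser(Map<String, ?> cached,
                                                  Map<String, ?> db,
                                                  String changeIndicator) {
        if (changeIndicator != null) {
            return differs(cached.get(changeIndicator), db.get(changeIndicator));
        }
        for (Map.Entry<String, ?> e : cached.entrySet()) {
            if (differs(e.getValue(), db.get(e.getKey()))) {
                return true;
            }
        }
        return false;
    }

    private static boolean differs(Object a, Object b) {
        return a == null ? b != null : !a.equals(b);
    }
}
```

With "DepartmentId" as the indicator, a Salary changed by another user goes unnoticed, while without an indicator the same difference raises the "row changed by another user" error.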

My previous post about handling DB constraint errors - Different Approach for DB Constraint Error Handling in ADF - uses a Change Indicator defined on the Primary Key. In that way, I completely ignore changes by other users and allow committing data no matter whether it was changed by someone else.

Here you can download the sample application with the Change Indicator demo - Change Indicator can be set on an EO attribute; I'm using Department Id in this example. This means all changes are allowed without informing the user about new changes in the DB, except when Department Id was changed:

I'm running the sample application in two different sessions, changing the Salary value for the same row in both. Salary is changed in the first session, but not yet committed:

The same row's Salary is changed from the second session and committed with the Save button:

Committing data in the same row from the first session doesn't bring an error about data changed by another user. This is expected - since the Change Indicator value was not changed, ADF doesn't know about the changes in the DB:

If I invoke rollback in the second session now, by pressing the Cancel button, I can see that the last updated Salary attribute value is displayed:

We can simulate a data change in the DB that triggers an update to the Change Indicator. Let's update the Change Indicator column value and commit this change:

Go back to the ADF UI, change data in the same row and try to save it:

The Change Indicator value was changed, so we get an error about another user's changes in the same row - as expected this time. Notice that the Change Indicator value is updated, with Department Id set to 30:

As it works by default in this case, press the Save button a second time and the data will be saved for the new Department Id:

Different Approach for DB Constraint Error Handling in ADF

Wed, 2014-02-05 13:25
Let's be honest - no matter how developer-friendly and stable validation rule support is in ADF BC, there will always be use cases where validation logic is executed directly in the DB, by check constraints for example. There is one problem in ADF when validation logic is executed by DB check constraints. Because an error is received in doDML while posting a row that violates a check constraint, the transaction stops and no other edited rows are verified until the currently failed row's data is fixed. In this post I will explain how to bypass this behaviour and report all failed rows to the ADF UI.

The key part to understand - there are two transient attributes defined on the VO level. They must be defined on the VO level: if you defined them on the EO level, the transient attributes would not be included in passivation and their values would be lost. One attribute is used as a flag to indicate a DB constraint error for the current row, and the other keeps the error text:

Both attributes are set to be included in the passivation cycle. This would not be possible for transient attributes defined on the EO level:

The doDML method is overridden on the EO level to catch the error from the DB check constraint. In this example, I'm processing only the Salary > 0 check constraint. In case of an error from the DB, the error text is saved into the ErrorText transient attribute of the current row. ErrorIndicator is set to 1 if an error happened, and 0 otherwise. I'm getting the current VO row for the current EO by accessing it from the root AM:
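
A minimal model of this catch-and-flag approach, without the oracle.jbo classes - the row class, the way the constraint is simulated, and the error message are illustrative, while ErrorIndicator/ErrorText mirror the transient attributes described above:

```java
public class ConstraintAwareRow {

    double salary;
    int errorIndicator;  // transient VO attribute in the sample application
    String errorText;    // transient VO attribute in the sample application

    ConstraintAwareRow(double salary) {
        this.salary = salary;
    }

    // Stands in for posting the row against a SALARY > 0 check constraint.
    private void postToDatabase() {
        if (salary <= 0) {
            throw new IllegalStateException("check constraint violated");
        }
    }

    // Modeled doDML: catch the constraint error and flag the row,
    // instead of letting the error stop the whole transaction.
    public void doDML() {
        try {
            postToDatabase();
            errorIndicator = 0;
            errorText = null;
        } catch (RuntimeException e) {
            errorIndicator = 1;
            errorText = e.getMessage();
        }
    }

    public static int flagAfterPost(double salary) {
        ConstraintAwareRow row = new ConstraintAwareRow(salary);
        row.doDML();
        return row.errorIndicator;
    }
}
```

Because the error is swallowed and recorded per row, posting continues for the remaining rows, and the UI can highlight every failed row in one pass.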

On the UI side, we add a Partial Trigger on the surrounding Panel Collection from the Save/Cancel buttons. This ensures the error code and error text for the affected rows are updated:

Client and server listeners are defined for the table to enable row double-click support - this is how the error message is displayed to the user (instead of an annoying popup showing up constantly):

The error message from the currently selected row is retrieved and displayed through a Faces message. I check whether an error exists for the current row and only then display it - quite straightforward:

Here is an example where the user tried to submit changes in three rows, all set with negative salary. Of course, the DB check constraint for positive salary failed and we can see the failed rows highlighted in red. You can double-click on a row highlighted in red and the error text popup will be displayed:

Let's fix the salary in the second failed row to be positive and press Save again. Two rows remain in a failed state, but data from the fixed row is saved to the DB successfully:

We can double-click on the first failed row to see the error message:

Fix the salary to be positive in the first row and commit again - only one row with an error will remain in a pending state:

Download sample application -

AMStatistics - Adding Custom Statistics to ADF BC

Mon, 2014-01-27 05:08
The ADF BC Application Module provides a standard class called AMStatistics. This class gives you info such as the AM creation time, the Web Session ID assigned to the current session, etc. What is cool about this class is that it is not limited to out-of-the-box statistics - it allows you to add your own custom statistics. I will describe a use case of tracking the activation timestamp for the AM and keeping it in a custom AMStatistics variable.

Here you can download the completed sample application - This sample overrides the ADF BC activateState method in the Application Module Implementation class. The activateState method is invoked by the framework when an activation event happens. At this point, we create a new custom StatsInfo instance and save it in AMStatistics (it overrides the previously saved statistic with the same name). The activation timestamp is saved into AMStatistics:

A custom method - checkLastActivationTime - accesses AMStatistics and reads the AM activation time custom statistic saved earlier. This method is invoked from a button.
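
The custom statistic bookkeeping can be modeled with a plain map standing in for the AMStatistics store. This is a sketch of the override-and-read pattern only; the class name, statistic name, and timestamps are illustrative:

```java
import java.util.HashMap;
import java.util.Map;

public class ActivationStats {

    // Stands in for the AMStatistics custom statistics store; saving a
    // statistic with the same name overrides the previous value.
    private final Map<String, Long> customStats = new HashMap<>();

    // Modeled activateState override: record when activation happened.
    public void activateState(long activationTimeMillis) {
        customStats.put("lastActivationTime", activationTimeMillis);
    }

    // Modeled custom method invoked from the button.
    public Long checkLastActivationTime() {
        return customStats.get("lastActivationTime");
    }

    public static long demo() {
        ActivationStats stats = new ActivationStats();
        stats.activateState(1000L);
        stats.activateState(2000L); // second activation overrides the first
        return stats.checkLastActivationTime();
    }
}
```

As in the sample, each new activation replaces the previously saved timestamp, so the button always reads the most recent activation time.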

To see the log output when the checkLastActivationTime method is invoked, make sure to set the INFO log level for the Application Module Implementation class:

For test purposes, I have set Referenced Pool Size = 1. This means that with two active users it will start to passivate and activate Application Module instances, and we will be able to see the changing timestamp:

Here we can see output from the runtime test. When the first user is activated, I press the Check Last Activation Time button - the timestamp is set and printed in the log:

Moving to the second user, activation is performed. Last activation time is updated:

As long as the Application Module instance remains assigned to the current user, the activation time remains unchanged - you can see this in the log:

This was an example of a use case where AMStatistics can be used to keep custom statistic values.

Practical Example for ADF Active Data Service

Thu, 2014-01-23 13:42
I have created a more complex and complete Active Data Service example, based on the one posted previously - Simple Example for ADF Active Data Service. The updated sample application uses JDBC to listen for updates in the DB. An updates counter is refreshed through ADS and displayed to the user. This sample is tested with Oracle XE 10g; you only need to grant change notification to the user connecting from the data source.

If you want to see more detailed runtime output for the sample application -, make sure to enable the following ADF logger classes:

I would like to start by explaining how the Model and ADS parts communicate. An interface is defined that acts as a bridge between the DB push controller and the ADS engine, to push refresh changes to the UI:

The ADS controller class (the one responsible for ADS push logic) implements the defined interface:

The push mechanism is started automatically by the ADS framework, when the framework method startActiveData is invoked. Inside this method, we create a new instance of the DB push controller and pass it the instance of the ADS controller (the one implementing the interface):
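
The bridge wiring described in these steps can be modeled without ADS or JDBC classes. The interface and class names below are illustrative; the point is that the DB push controller only knows the interface, while the ADS controller implements it and pushes updates to the UI:

```java
import java.util.ArrayList;
import java.util.List;

public class AdsBridgeDemo {

    // Bridge interface between the DB push controller and the ADS engine.
    interface DataUpdateListener {
        void onDatabaseChange(int changeCount);
    }

    // Stands in for the ADS controller class responsible for push logic.
    static class AdsController implements DataUpdateListener {
        final List<Integer> pushedCounts = new ArrayList<>();

        @Override
        public void onDatabaseChange(int changeCount) {
            // real code would fire an ADS update event to refresh the UI
            pushedCounts.add(changeCount);
        }
    }

    // Stands in for the JDBC DB change-notification listener.
    static class DbPushController {
        private final DataUpdateListener listener;
        private int changes = 0;

        DbPushController(DataUpdateListener listener) {
            this.listener = listener;
        }

        void notifyChange() {
            listener.onDatabaseChange(++changes);
        }
    }

    // Modeled startActiveData: wire the controllers and replay DB updates.
    public static List<Integer> simulate(int updates) {
        AdsController ads = new AdsController();
        DbPushController db = new DbPushController(ads);
        for (int i = 0; i < updates; i++) {
            db.notifyChange();
        }
        return ads.pushedCounts;
    }
}
```

Keeping the dependency on the interface (rather than on the ADS controller class) is what lets the Model-side push controller stay free of any ViewController imports.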

The DB push controller in turn starts a listener for DB change notifications. You must execute a select from the DB table you want to listen to:

ADS provides a method that is invoked automatically when ADS is deregistered - we stop the DB push controller from here:

Now I'm running the test. There are two different sessions; I change data in the first session and save it to the DB:

As this change is saved to the DB, it is propagated across all sessions registered with ADS - we can see a new change number is available:

I can click Synch Changes to re-execute ADF BC and bring all the latest changes into the current session; you can see the changes count is reset to 0. Make a change again and press Save:

You will see the change count incremented in the second browser session, as it was not synchronised previously. This means we had two updates in the DB, and the currently displayed data needs to be synched:

From the ADF log we can see that when a new browser session is opened, it first starts a new ADS thread and then registers the DB change push controller with the DB notification listener. DB change notifications are logged. At the end, when the browser page with ADS support is closed, the ADS thread is stopped automatically; it stops the DB push controller and deregisters the DB notification listener:

Once the user clicks the Synch Changes link, we call an action listener in the ADS controller bean, where we set the latest synch time and reset the counter by pushing this change through an ADS update event:

The latest synch time variable is referenced from an ADF task flow parameter:

The ADF task flow is set to Refresh = ifNeeded, which means it refreshes automatically each time the parameter changes (to synchronise changes from the DB using ADF BC):

Update for Red Samurai Performance Audit Tool - v 2.4

Sun, 2014-01-19 10:02
This weekend we finalized the latest update to our ADF runtime performance audit tool - Red Samurai Performance Audit Tool v 2.4. You can read about the features included in the previous update, v 2.3, in this post - Update for Red Samurai Performance Audit Tool - v 2.3. The current update, v 2.4, is focused on the usability of the Slow Query and Large Fetch drill-down screens. We provide more detailed information to understand how your ADF application has performed recently and how applied tuning improves performance.

List of improvements in v 2.4:

1. Improved nested Application Modules activation auditing

2. Improved first screen of the performance dashboard application. The Types of Issues graph displays the calculated total number of issues. There is an option to filter issues by type:

3. The drill-down screen for Slow Queries is more intelligent now. It displays the latest issue occurrences (over 5 days), together with total occurrences over the month. This allows you to understand whether applied tuning gives any result, and to distinguish new issues more easily:

4. Previously the drill-down screen for Slow Queries displayed only the number of issue occurrences. Now you can go even further and view the list of issues with their SQL statements. Previously you could do this only from the first screen of the performance dashboard; now the same is enabled from the drill-down screens for your convenience:

5. The drill-down screen for Large Fetches is enhanced in the same way as the one for Slow Queries. We can see recently logged issues, together with all issues logged over the last month:

6. Details for every issue occurrence can be viewed for each logged large fetch:

7. The Logged Users graph was updated to display the average number of users per hour, together with the total number of users per day. This allows you to better understand the real workload during each day: