
Feed aggregator

An OOW Summary from the ADF and MAF perspective

Shay Shmeltzer - Fri, 2014-10-03 12:39

Another Oracle OpenWorld is behind us, and it was certainly a busy one for us. In case you didn't have a chance to attend, or to follow the Twitter frenzy during the week, here are the key takeaways you should be aware of if you develop with either Oracle ADF or Oracle MAF.

 Oracle Alta UI

We released our design patterns for building modern applications for multiple channels. This includes a new skin and many samples that show you how to create the type of UIs we are now using for our modern cloud-based interfaces.

All the resources are at

The nice thing is that you can start using it today in both Oracle ADF Faces and Oracle MAF - just switch the skin to get the basic color scheme. Instructions here.

Note, however, that Alta is much more than just a color change. If you really want an Alta-type UI, you need to start designing your UI differently - take a look at some of the screen samples or our demo application for ideas.

Cloud Based Development

A few weeks before OOW we released our Developer Cloud Service into production, and our booth and sessions showing it were quite popular. For those who are not familiar with it, the Developer Cloud Service gives you a hosted environment for managing your code life cycle (Git version management, Hudson continuous integration, and easy cloud deployment), and it also gives you a way to track your requirements and manage team work.

While this would be relevant to any Java developing team, for ADF developers there are specific templates in place to make things even easier.

You can experience this in trial mode by getting a trial Java service account here.

Another developer-oriented cloud service that got a lot of focus this year was the upcoming Oracle Mobile Cloud Service - which includes everything your team will need in order to build mobile backends (APIs, connectors, notifications, storage and more). We ran multiple hands-on labs and sessions covering it, and it was featured in many keynotes too.

In the application development tools general session we also announced that in the future we'll provide a capability called Oracle Mobile Application Accelerator (Oracle MAX for short), which will allow power users to easily build on-device mobile applications through a web interface. The applications will leverage MAF as the framework, and as a MAF developer you'll be able to provide additional templates, components and functionality for them.

Another capability we showed in the same session was a cloud based development environment that we are planning to add to both the Developer Cloud Service and the Mobile Cloud Service - for developers to be able to code in the cloud with the usual functions that you would expect from a modern code editor.


The Developer Community is Alive and Kicking

The ADF and MAF sessions were quite full this year, and additional community activities were successful as well, starting with a set of ADF/MAF sessions by users on the Sunday, courtesy of ODTUG and the ADF EMG. In one of those sessions, members of the community announced a new ADF data control for XML. Check out the work they did!

ODTUG also hosted a nice meetup for ADF/MAF developers and announced their upcoming mobile conference in December. Their KScope15 summer conference is also looking for your abstract right now!

Coding Competition

Want to earn some money on the side? Check out the Oracle MAF Developer Challenge - build a mobile app and you can earn prizes ranging from $1,000 to $6,000.


With so many events taking place, it is sometimes hard to hit all the sessions you are interested in. And while the best experience is to be in the room, you might get some mileage from just looking at the slides. You can find the slides for many sessions in the session catalog here, and a list of the ADF/MAF sessions here.

See you next year. 

Categories: Development

LinkedIn Releases College Ranking Service

Michael Feldstein - Fri, 2014-10-03 09:57

I have long thought that LinkedIn has the potential to be one of the most transformative companies in ed tech for one simple reason: They have far more cross-institutional longitudinal outcomes data than anybody else—including government agencies. Just about anybody else who wants access to career path information of graduates across universities would face major privacy and data gathering hurdles. But LinkedIn has somehow convinced hundreds of millions of users to voluntarily enter that information and make it available for public consumption. The company clearly knows this and has been working behind the scenes to make use of this advantage. I have been waiting to see what they will come up with.

I have to say that I’m disappointed with their decision that their first foray would be a college ranking system. While I wouldn’t go so far as to say that these sorts of things have zero utility, they suffer from two big and unavoidable problems. First, like any standardized test—and I mean this explicitly in the academic meaning of the term “test”—they are prone to abuse through oversimplification of their meaning and overemphasis on their significance. (It’s not obvious to me that they would be subject to manipulation by colleges the way other surveys are, given LinkedIn’s ranking method, so at least there’s that.) Second and more importantly, they are not very useful even when designed well and interpreted properly. Many students change their majors and career goals between when they choose their college and when they graduate. According to the National Center for Education Statistics, 80% of undergraduates change their majors at least once, and the average student changes majors three times. Therefore, telling high school students applying to college which school is ranked best for, say, a career in accounting has less potential impact on the students’ long-term success and happiness than one might think.

It would be more interesting and useful to have LinkedIn tackle cross-institutional questions that could help students make better decisions once they are in a particular college. What are the top majors for any given career? For example, if I want to be a bond trader on Wall Street, do I have to major in finance? (My guess is that the answer to this question is “no,” but I would love to see real data on it.) Or how about the other way around: What are the top careers for people in my major? My guess is that LinkedIn wanted to start off with something that (a) they had a lot of data on (which means something coarse-grained) and (b) was relatively simple to correlate. The questions I’m suggesting here would fit that bill while being more useful than a college ranking system (and less likely to generate institutional blow-back).

The post LinkedIn Releases College Ranking Service appeared first on e-Literate.

OCP 12C – Real-Time Database Operation Monitoring

DBA Scripts and Articles - Fri, 2014-10-03 09:34

What is Real Time Database Operation Monitoring ? Real Time Database Operation Monitoring will help you track the progress of a set of sql statements and let you create a report. Real Time Database Operation Monitoring acts as a superset of all monitoring components like : ASH, DBMS_MONITOR … You can generate Active Reports which are [...]

The post OCP 12C – Real-Time Database Operation Monitoring appeared first on Oracle DBA Scripts and Articles (Montreal).

Categories: DBA Blogs

SQL Patch: Another way to change the plan without changing the code

Yann Neuhaus - Fri, 2014-10-03 09:02

Recently, at a customer site, I faced a performance issue. However, as often the statement is embedded in the application so it's not possible to rewrite the query. In this blog post, we'll change the execution plan to solve the problem without changing the code - thanks to SQL Patch.

Log Buffer #391, A Carnival of the Vanities for DBAs

Pythian Group - Fri, 2014-10-03 08:04

Oracle Open World is in full bloom. Enthusiasts of Oracle and MySQL are flocking to extract as much knowledge, news, and fun as possible. SQL Server aficionados are not far behind, either.


Frank Nimphius announced the REST support for ADF BC feature at OOW today. This functionality will probably be available in the next JDeveloper 12c update release.

RMAN Enhancements: New Privilege. A new SYSBACKUP privilege is created in Oracle 12c; it allows the grantee to perform BACKUP and RECOVERY operations, and to run SQL, in RMAN.

To continue with the objective of separation of duties and least privilege, Oracle 12c introduces new administrative privileges, all destined to accomplish specific duties.

Unified Auditing offers a consolidated approach: all the audit data is consolidated in a single place. Unified Auditing consolidates audit records from the following sources.

SOA Suite 12c – WSM-02141 : Unable to connect to the policy access service.

SQL Server:

Data Compression and Snapshot Isolation don’t play well together; you may not see a performance benefit.

Tim Smith answers some questions on SQL Server security like: Is It Better To Mask At the Application Level Or The SQL Server Database Level?

Since SQL Server delivered the entire range of window functions, there has been far less justification for using the non-standard ex-Sybase ‘Quirky Update’ tricks to perform the many permutations of running totals in SQL Server.

Easily synchronize live Salesforce data with SQL Server using the Salesforce SSIS DataFlow Tasks.

Change All Computed Columns to Persisted in SQL Server.


Low-concurrency performance for point lookups: MySQL 5.7.5 vs previous releases.

How to get MySQL 5.6 parallel replication and XtraBackup to play nice together.

The InnoDB labs release includes a snapshot of the InnoDB Native Partitioning feature.

Visualizing the impact of ordered vs. random index insertion in InnoDB.

Single thread performance in MySQL 5.7.5 versus older releases via sql-bench.

Categories: DBA Blogs

Virtualbox: only 32bit guests possible even though virtualization enabled in BIOS / Intel Process Identification Utility shows opposite to BIOS virtualization setting

Dietrich Schroff - Fri, 2014-10-03 03:08
VirtualBox on my Windows 8.1 machine stopped running 64-bit guests a while ago. I did not track down the problem at the time. Now, some months later, I tried again and found some confusing things.

First setting:
BIOS virtualization enabled
Intel Processor Identification Utility in 8.1: virtualization disabled

Second setting:
BIOS virtualization disabled
Intel Processor Identification Utility in 8.1: virtualization enabled

With both settings: VirtualBox runs 32-bit guests but no 64-bit guests.

After some searching, I realized what was happening: I had added Microsoft's Hyper-V virtualization. With Hyper-V enabled, Windows 8.1 is no longer a real host; it is just another guest (the most important guest) on this computer. So with Hyper-V enabled, I was trying to run VirtualBox inside an already virtualized Windows 8.1.

After that it was easy: just disable Hyper-V on Windows 8.1.

And after a restart of Windows 8.1, I was able to run 64-bit guests in VirtualBox again.

Java get Class names from package String in classpath

Yann Neuhaus - Fri, 2014-10-03 01:28

As a Java developer, you probably know about reflection. However, in order to keep your software architecture flexible, some functionality is not provided out of the box by the JVM.

In my particular case, I needed to find every class and sub-class inside a package, possibly spread across several jars.

The internet has lots of solutions, but none of them made it simple to reach this goal. After some googling, I found a link that provided a partial solution; I would like to thank the website author.

Other solutions required deploying external libraries as well, but I was not interested in managing another library in my software just for that purpose.

So the solution was to retrieve all the jars from the context class loader and loop over them in order to find the classes we are looking for.

Below you will find a complete Java class that solves the problem:








import java.io.File;
import java.io.IOException;
import java.io.UnsupportedEncodingException;
import java.net.URL;
import java.net.URLClassLoader;
import java.net.URLDecoder;
import java.util.ArrayList;
import java.util.Enumeration;
import java.util.HashMap;
import java.util.List;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;

/**
 * @author Philippe Schweitzer dbi services Switzerland
 */
public class ClassFinder {

    public static void main(String[] args) throws ClassNotFoundException {

        List<Class> classes = ClassFinder.getClassesFromPackage("YOUR PACKAGE NAME");

        System.out.println("START ClassList:");
        for (Class c : classes) {
            System.out.println(c.toString());
        }
        System.out.println("END ClassList:");
    }

    /**
     * Attempts to list all the classes in the specified package as determined
     * by the context class loader.
     *
     * @param pckgname the package name to search
     * @return a list of classes that exist within that package
     * @throws ClassNotFoundException if something went wrong
     */
    public static List getClassesFromPackage(String pckgname) throws ClassNotFoundException {

        ArrayList result = new ArrayList();
        ArrayList<File> directories = new ArrayList();
        HashMap packageNames = null;

        try {
            ClassLoader cld = Thread.currentThread().getContextClassLoader();
            if (cld == null) {
                throw new ClassNotFoundException("Can't get class loader.");
            }

            for (URL jarURL : ((URLClassLoader) cld).getURLs()) {
                System.out.println("JAR: " + jarURL.getPath());
                getClassesInSamePackageFromJar(result, pckgname, jarURL.getPath());

                // Also collect plain directories on the classpath that contain the package
                String path = pckgname.replace('.', '/');
                Enumeration<URL> resources = cld.getResources(path);
                File directory = null;

                while (resources.hasMoreElements()) {
                    String path2 = resources.nextElement().getPath();
                    directory = new File(URLDecoder.decode(path2, "UTF-8"));
                    directories.add(directory);
                }

                if (packageNames == null) {
                    packageNames = new HashMap();
                }
                packageNames.put(directory, pckgname);
            }
        } catch (NullPointerException x) {
            throw new ClassNotFoundException(pckgname + " does not appear to be a valid package (Null pointer exception)");
        } catch (UnsupportedEncodingException encex) {
            throw new ClassNotFoundException(pckgname + " does not appear to be a valid package (Unsupported encoding)");
        } catch (IOException ioex) {
            throw new ClassNotFoundException("IOException was thrown when trying to get all resources for " + pckgname);
        }

        // Second pass: load the .class files found in plain directories
        for (File directory : directories) {
            if (directory.exists()) {
                String[] files = directory.list();

                for (String file : files) {
                    if (file.endsWith(".class")) {
                        try {
                            result.add(Class.forName(packageNames.get(directory).toString() + '.' + file.substring(0, file.length() - 6)));
                        } catch (Throwable e) {
                            // skip classes that cannot be loaded
                        }
                    }
                }
            } else {
                throw new ClassNotFoundException(pckgname + " (" + directory.getPath() + ") does not appear to be a valid package");
            }
        }

        return result;
    }

    /**
     * Adds to result all the classes of the given package found in the given jar.
     *
     * @param result      the list collecting the matching classes
     * @param packageName the package name to search
     * @param jarPath     the path of the jar file to scan
     */
    private static void getClassesInSamePackageFromJar(List result, String packageName, String jarPath) {

        JarFile jarFile = null;
        packageName = packageName.replace('.', '/');

        try {
            jarFile = new JarFile(jarPath);
            Enumeration<JarEntry> en = jarFile.entries();

            while (en.hasMoreElements()) {
                JarEntry entry = en.nextElement();
                String entryName = entry.getName();

                if (entryName != null && entryName.endsWith(".class") && entryName.startsWith(packageName)) {
                    try {
                        Class entryClass = Class.forName(entryName.substring(0, entryName.length() - 6).replace('/', '.'));

                        if (entryClass != null) {
                            result.add(entryClass);
                        }
                    } catch (Throwable e) {
                        // do nothing, just continue processing classes
                    }
                }
            }
        } catch (Exception e) {
            // ignore jars that cannot be opened
        } finally {
            try {
                if (jarFile != null) {
                    jarFile.close();
                }
            } catch (Exception e) {
                // ignore
            }
        }
    }
}
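Since my original goal was to find sub-classes as well, the flat list returned by getClassesFromPackage can simply be filtered afterwards. The following stand-alone sketch (the class name SubclassFilter is mine, not part of the original code) shows one way to do it with Class.isAssignableFrom:

```java
import java.util.ArrayList;
import java.util.List;

public class SubclassFilter {

    /** Keeps only the classes that are subtypes of the given supertype. */
    public static List<Class> filterSubclasses(List<Class> classes, Class<?> supertype) {
        List<Class> result = new ArrayList<>();
        for (Class c : classes) {
            // isAssignableFrom is also true for the supertype itself, so exclude it explicitly
            if (supertype.isAssignableFrom(c) && !c.equals(supertype)) {
                result.add(c);
            }
        }
        return result;
    }

    public static void main(String[] args) {
        List<Class> all = new ArrayList<>();
        all.add(ArrayList.class);
        all.add(java.util.HashMap.class);
        all.add(List.class);

        // Only ArrayList is a subtype of List here (List itself is excluded)
        System.out.println(filterSubclasses(all, List.class));
    }
}
```

Feeding it the result of ClassFinder.getClassesFromPackage then gives every sub-class of a given type within the package.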

OOW14 Update: Oracle OpenWorld 2014 comes to an end

Javier Delgado - Fri, 2014-10-03 00:11
Today was the last day of Oracle OpenWorld 2014 in San Francisco. Even though it started a bit later due to yesterday's Appreciation Event, which hosted Aerosmith, Spacehog and Macklemore & Ryan Lewis (I did not attend, but that's a different story), the day was packed with good sessions. I particularly appreciated the PeopleTools Meet the Experts session, which allowed me to network with Oracle PeopleTools experts and share points of view with other partners and customers.

From a PeopleSoft perspective, the event produced some news, but nothing unexpected or that had not been rumoured on the internet in recent weeks. Here is a summary of the news that I found most interesting (*):

  • Fluid interface was the hottest topic from a PeopleSoft standpoint. As previously seen on this blog, Oracle announced the availability of the first applications in the coming days.
  • Fluid is initially intended for casual and executive users, but there is a plan to extend it to power users. From my point of view, not only would the interface need to improve a bit in order to achieve that, but development should also be somehow simplified, as designing Fluid pages currently requires more effort than traditional PIA pages.
  • These features are on the roadmap for the Fluid interface: wizards for tile creation, related content, activity guides, and a master/detail page template.
  • Oracle has no plans to deliver PeopleSoft 9.3. Still, this does not mean that they will stop investing in PeopleSoft (read more).
  • I was pleasantly surprised by the interest attendees showed in the PeopleSoft Test Framework sessions. This tool has been around for a while, but customer adoption has been slow. The new Continuous Delivery Model may bring some interest to this tool, as testing should become more iterative.
  • On the architecture side, the ability to use the new in-memory features of Oracle DB 12c under PeopleTools 8.54 brings unprecedented performance to PeopleSoft environments. Still, you would need to dedicate a minimum of 100 GB of memory to the in-memory part of the database SGA; but if you have the money, it seems worth going for it.

This has been a very interesting and intense week. Now, a few days to rest and return home, and then back to work with some new perspectives and ideas.

(*) Keep in mind Oracle's Safe Harbor statement, which practically says that what was presented during the sessions does not express a commitment from Oracle.

REST Support for ADF BC in 12c

Andrejus Baranovski - Thu, 2014-10-02 18:45
Frank Nimphius announced the REST support for ADF BC feature at OOW today. This functionality will probably be available in the next JDeveloper 12c update release.

Once REST is enabled for an Application Module, a new XML definition file and project will be created. Here you can see how the new wizard will look for a REST definition on top of ADF BC:

Multi Sheet Excel Output

Tim Dexter - Thu, 2014-10-02 17:28

I'm on a roll with posts. This blog can be rebuilt ...

I received a question today from Camilo in Colombia asking how to achieve the following.

‘What are my options to deliver excel files with multiple sheets? I know we can split 1 report in multiple sheets in with the BIP Advanced Options, but what if I want to have 1 report / sheet? Where each report in each sheet has a independent data model ….’

Well, it's not going to be easy if you have to have completely separate data models for each sheet. That would require generating multiple Excel outputs and then merging them, somehow.

However, if you can live with a single data model with multiple data sets, i.e. queries that connect to separate data sources, something like this:

Then we can help. Each query is returning its own data set but they will all be presented together in a single data set that BIP can then render. Our data structure in the XML output would be:


Three distinct data sets within the same data output.

To get each to sit on a separate sheet within the Excel output is pretty simple. It depends on how much native Excel functionality you want.

Using an RTF template, you just create the layouts for each data set on a page (or pages), separated by a page break (Ctrl-Enter). At runtime, BIP will place each output onto a separate sheet in the workbook. If you want to name each sheet you can use the <?spreadsheet-sheet-name: xpath-expression?> command. More info here. That's as sophisticated as it gets with RTF templates: no calcs, no formulas, etc. Just put the output on a sheet, bam!

Using an Excel template you can get more sophisticated with the layout.

This time, though, you create the layout for each data set on a separate sheet. In my example, sheet 1 holds the department data, sheet 2 the employee data, and so on. Some conditional formatting has snuck in there.

I have zipped up the sample files here.


Categories: BI & Warehousing

OOW14 Day 4 - Internals of the 12c Multitenant Architecture by Vit Spinka

Yann Neuhaus - Thu, 2014-10-02 14:47

This is the session I preferred at Oracle Open World. Well, I'm writing that before going to @ludodba's one, and I'm sure I'll then have two preferred sessions... So, Vit Spinka presented the internals of the new multitenant architecture. It's always good to play with internals: not only for the geeky fun of it, but also because it helps to understand how things work and to address issues later.

I had investigated the metadata/object links in my blog post (and thanks to Vit for having referenced it during his presentation). But I learned from Vit what has changed in the redo logs. In his case, the research on redo log internals is not just curiosity; it's mandatory for his job: he is the principal developer of Dbvisit Replicate, which reads the redo logs: the MINER process reads them and transforms them into something that can be used by the APPLY process.

So I'll not repeat what is available in his slides: 

Finally, the redo is quite the same, except that it adds the container id (one byte only, because we are limited to 252 PDBs). Addressing the files is not very special, as plugging a database is very similar to transportable tablespaces. Addressing the objects is a bit different, because we can have the same object id across several PDBs, and this is the reason for introducing the container id in the redo. But that's not so special.

The thing to remember about the new multitenant architecture is that:

  • it is not a big change for the instance, which still manages the same objects (sessions, cursors, services, buffers, etc.), just adding a container id
  • it is no change for the database files, as transportable tablespaces already introduced 'plugging'
  • all the magic is in the dictionary, in order to have a shared dictionary for Oracle objects and a private dictionary for application objects.

OCP 12C – RMAN and Flashback Data Archive

DBA Scripts and Articles - Thu, 2014-10-02 14:36

RMAN Enhancements: New Privilege. A new SYSBACKUP privilege is created in Oracle 12c; it allows the grantee to perform BACKUP and RECOVERY operations with RMAN. You can now use SQL statements in RMAN as you would in SQL*Plus. BEFORE: RMAN> SQL “alter system switch logfile”; NOW: RMAN> alter system switch logfile; [...]

The post OCP 12C – RMAN and Flashback Data Archive appeared first on Oracle DBA Scripts and Articles (Montreal).

Categories: DBA Blogs

Oracle Priority Support Infogram for 02-OCT-2014

Oracle Infogram - Thu, 2014-10-02 13:59

What do businesses need to prepare for cloud migration?

Chris Foot - Thu, 2014-10-02 13:47

Whether to host applications or increase storage, migrating workloads to cloud environments is a consistent trend. However, many database support services are discovering that businesses unfamiliar with the technology often don't know where to begin. 

It appears more enterprises will need guidance in the near future. Business Cloud News conducted a survey of 312 IT professionals across the United Kingdom, Europe and North America, finding 40 percent of participants believe 30 to 70 percent of their IT assets will be hosted in the cloud in the next two years. 

So, what are some pain points interested parties should be cognizant of? 

1. A lack of in-house capabilities 
It's a point organizations have made in the past, but it still deserves acknowledgement. Although in-house IT staff members are capable of managing the transition from on-premise systems to a cloud environment, many require extensive instruction before they can do so. Even after training is completed, their lack of experience will likely cause interruptions.

In this regard, outsourcing is a safe choice. Hiring remote DBA experts to work with existing teams to migrate all applications and storage to a cloud infrastructure will expedite the process while also ensuring long-term issues don't persist. 

2. Look at what applications are connected to 
Hybrid cloud deployments are quite common among organizations that want to host a portion of their IT assets in the cloud, but retain full back-end control over critical applications.

Suppose a company that leverages a hybrid environment wants to transition its enterprise resource planning (ERP) solution to a hosted environment. However, the ERP's file stores reside on on-premise servers. In order for the ERP solution to undergo migration, the file stores it depends on to operate must be relocated beforehand.

3. Observe indirect connections
Some on-premise deployments may seem isolated from other implementations but encounter hindrances when operating in the cloud. TechTarget noted one example detailed by Robert Green, principal cloud strategist at IT consultancy Enfinitum, who stated that one of the firm's clients migrated an application to a public cloud environment without first conducting a thorough assessment.

What the company failed to recognize was that on-premise firewalls that assessed and filtered Internet traffic would directly impact its employees' ability to access the cloud-hosted application. When 400 users attempted to use the software, the firewalls became overloaded. In the end, the Enfinitum client lost $10 million because its workers were unable to use the application. 

If these three points are carefully considered, enterprises will be successful in all their cloud migration endeavors. 

The post What do businesses need to prepare for cloud migration? appeared first on Remote DBA Experts.

OCP 12C – Privileges

DBA Scripts and Articles - Thu, 2014-10-02 13:27

User Task-Specific Administrative Privileges: To continue with the objective of separation of duties and least privilege, Oracle 12c introduces new administrative privileges, all destined to accomplish specific duties: SYSBACKUP : Used for RMAN operations like BACKUP, RESTORE, RECOVER; SYSDG : Used to administer Data Guard. In 12c, when you use the DGMGRL command-line interface, you are automatically [...]

The post OCP 12C – Privileges appeared first on Oracle DBA Scripts and Articles (Montreal).

Categories: DBA Blogs

Microsoft Hadoop: Taming the Big Challenge of Big Data – Part Two

Pythian Group - Thu, 2014-10-02 11:13

Today’s blog post is the second in a three-part series with excerpts from our latest white paper, Microsoft Hadoop: Taming the Big Challenge of Big Data. In our first blog post, we revealed just how much data is being generated globally every minute – and that it has doubled since 2011.

Traditional database management tools were not designed to handle the elements that make big data so much more complex—namely its key differentiators: volume, variety, and velocity.

Volume is the quantity of data, variety refers to the type of data collected (image, audio, video, etc.), and velocity is its expected growth rate. Many people assume that big data always includes high volume and intuitively understand the challenges that it presents. In reality, however, data variety and velocity are much more likely to prevent traditional management tools from being able to efficiently capture, store, report, analyze, and archive the data, regardless of volume.

Download our full white paper which explores the technical and business advantages of effectively managing big data, regardless of quantity, scope, or speed.


Categories: DBA Blogs

Oracle Technology Network Wednesday in Review / Thursday Preview - Oracle OpenWorld and JavaOne

OTN TechBlog - Thu, 2014-10-02 10:20

Annual Blogger Meetup was a hoot!  Thanks for joining us.

OTN Lounge activities come to a close today. Last chance to learn more about the Oracle ACE Program and the Oracle Community from the program leads. See more below -

Oracle ACE Program – 11:30 to 1:30 - The Oracle ACE Program recognizes prominent advocates. Learn how to become an Oracle ACE, gain community recognition for sharing your knowledge and expertise, advance your career, and network with like-minded peers.

Oracle Community - 11:30 to 3:30 - Learn about the Oracle Technology Network Community Platform, and get a preview of the new badges that are coming soon! Get answers to questions, network with peers, and be rewarded for your expertise in the Oracle Community.

Don't forget the OTN team has been busy shooting video and attending sessions.  See what they've been up to so far -

Blogs -
The Java Source Blog
OTN DBA/DEV Watercooler

YouTube Channels -
OTN Garage
OTN ArchBeat

Follow @JavaOneConf for conference-specific announcements

See you next year!

OCP 12C – Auditing

DBA Scripts and Articles - Thu, 2014-10-02 09:57

Unified Audit Data Trail: Unified Auditing offers a consolidated approach; all the audit data is consolidated in a single place. Unified Auditing consolidates audit records from the following sources: Standard Auditing, Fine-grained auditing (DBMS_FGA), RAC security auditing, RMAN auditing, Database Vault auditing, Oracle Label Security auditing, Oracle Data Mining, Oracle Data Pump, Oracle SQL*Loader. In [...]

The post OCP 12C – Auditing appeared first on Oracle DBA Scripts and Articles (Montreal).

Categories: DBA Blogs

Kuali Foundation: Clarification on future proprietary code

Michael Feldstein - Thu, 2014-10-02 08:35

Well that was an interesting session at Educause as described at Inside Higher Ed:

It took the Kuali leadership 20 minutes to address the elephant in the conference center meeting room.

“Change is ugly, and change is difficult, and the only difference here is you’re going to see all the ugliness as we go through the change because we’re completely transparent,” said John F. (Barry) Walsh, a strategic adviser for the Kuali Foundation. “We’re not going to hide any difficulty that we run into. That’s the way we operate. It’s definitely a rich environment for people who want to chuck hand grenades. Hey, have a shot — we’re wide open.” [snip]

Walsh, who has been dubbed the “father of Kuali,” issued that proclamation after a back-and-forth with higher education consultant Phil Hill, who during an early morning session asked the Kuali leadership to clarify which parts of the company’s software would remain open source.

While the article describes the communication and pushback issues with Kuali’s creation of a for-profit entity quite well (go read the whole article), I think it’s worth digging into what Carl generously describes as a “back-and-forth”. What happened was that there was a slide describing the relicensing of Kuali code as AGPL, and the last bullet caught my attention:

  • AGPL > GPL & ECL for SaaS
  • Full versions always downloadable by customers
  • Only feature “held back” is multi-tenant framework

If you need a read on the change of open source licenses and why this issue is leading to some of the pushback, go read Chuck Severance’s blog post.

Does ‘held back’ mean that the multi-tenant framework to enable cloud hosting partially existed but is not moving to AGPL, or does it mean that the framework would be AGPL but not downloadable by customers, or does it mean that the framework is not open source? That was the basis of my question.

Several Kuali Foundation representatives attempted to indirectly answer the question without addressing the license.

“I’ll be very blunt here,” Walsh said. “It’s a commercial protection — that’s all it is.”

The back-and-forth involved trying to get a clear answer, and the answer is that the multi-tenant framework to be developed / owned by KualiCo will not be open source – it will be proprietary code. I asked Joel Dehlin for additional context after the session, and he explained that all Kuali functionality will be open source, but the infrastructure to allow cloud hosting is not open source.

This is a significant clarification on the future model. While Kuali has always supported an ecosystem with commercial partners that can offer proprietary code, this is the first time that Kuali itself will offer proprietary, non open source code.[1]

What is not clear is whether any of the “multi-tenant framework” already exists and will be converted to a proprietary license or if all of this code will be created by KualiCo from the ground up. If anyone knows the answer, let me know in the comments.

From IHE:

“Unfortunately some of what we’re hearing is out of a misunderstanding or miscommunication on our part,” said Eric Denna, vice president of IT and chief information officer at the University of Maryland at College Park. “Brad [Wheeler, chair of the foundation’s board of directors,] and I routinely are on the phone saying, ‘You know, we have day jobs.’ We weren’t hired to be communications officers.”

Suggestion: Simple answers such as “What ‘held back’ means is that the framework will be owned by KualiCo and not open source and therefore not downloadable” would avoid some of the perceived need for communication officers.

  1. Kuali Foundation is partial owner and investor in KualiCo.

The post Kuali Foundation: Clarification on future proprietary code appeared first on e-Literate.

Day 3 at Oracle Open World 2014 - Cloud: Private or Public?

Yann Neuhaus - Thu, 2014-10-02 07:16

One of the main subjects of this year's Oracle OpenWorld was the Cloud. In this post, I will share some thoughts on this: is the Cloud a dream, a reality, fog, or smoke?



Before going to OOW 2014, I did not have a fixed position on the Cloud. I had some questions, for instance about security, like, I guess, other people. I attended several sessions and started writing this blog two days ago to summarize my reflections. Fortunately, I was in a session on Wednesday where the discussion was "Private Cloud? Public Cloud?", and after that I had to update this post, in a good way.



Now, as we can choose between a public, private, or even mixed Cloud, the sky is less... cloudy.

I am a bit more convinced. I would use the Public Cloud for the development environment, as this will reduce implementation time and give flexibility to the developers. Why not also for training, as we do not have to use production data? For QA, clone, and production environments, I would rather use the Private Cloud for the time being. In both cases, we will benefit from great agility, low expenses, better utilization of resources, and so on.

But there are still some questions:

  • Although we can install the Private Cloud like the Public one, what will be the impact on the budget?

  • What will be the impact on our day to day work?

  • What will be our role as an integrator working more on the infrastructure layer?

  • Do we have a view, and can we have valuable discussions with the people who manage our system in the Public Cloud in case we hit issues? Can a good relationship be built?

  • Today we increase our skills while working, also in Dev, on the on-premises installation. During the integration phase, we can sometimes hit issues that have to be solved. This, of course, avoids having the same problems later in the other environments. How will this work in case we use a mixed environment, Public for Dev and Private for Prod?

  • Who will do the load and stress tests in case the Public Cloud is used?

  • In a validated system we have to follow strict procedures (e.g. GxP). How is this managed in the Public Cloud? Are we still compliant? Are the people managing the Public Cloud trained on the company's procedures? The last session confirmed that in that case we have to use the Private Cloud.

  • Besides the regulatory rules, what will be the acceptance in the different countries? Will it be the same in Europe as in the USA? Do we have the same culture regarding this subject?

  • Another question, which is not related to the technology: how should the future young IT people be positioned? What kind of training should they follow? Is this already in place in the universities? Are the teachers aware of those changes?

I've got lots of information on the technical side, but I am also interested in the social and human point of view. It would be interesting to have the view of a sociologist or philosopher, as this new approach will certainly have an impact on our lives, like Big Data will.

I probably missed some information in all this flow, and I don't have all the keys. But I think we are at the beginning of a revolution? evolution? or an opportunity.

Let's see what will happen in the next few years, and think about it in the meantime...