
Development

Oracle APEX Cookbook: Second Edition

Dimitri Gielis - Wed, 2014-06-11 14:24
For the first Oracle APEX Cookbook I was involved as a reviewer.

Michel and Marcel updated their book at the end of last year, but I hadn't taken the time to blog about it yet - and the months fly by. The concept stayed the same as in the first edition, but it was updated with the latest info for APEX 4.x.

"People who followed a beginner training or learned APEX at their own and they want to know how to do a specific thing which is covered in the book, it's great to have the book, as you can just follow what the authors wrote and you also have an idea why it's done like that."

If you need onsite Oracle APEX training, you can also contact my company APEX R&D :)
Categories: Development

Social Authentication (Facebook) in WC2014Challenge

Dimitri Gielis - Tue, 2014-06-10 15:33
These days people expect that you can authenticate to a public website with Facebook, Google+, LinkedIn, Microsoft, etc. It's very convenient, as you don't need to create a specific account per website.

Background

All of the social networks have very good documentation on how to call their APIs.
Here, for example, is the Facebook Login flow explained.

Most of the APIs use the OAuth 2.0 protocol: there's an application key, and tokens are sent with the requests. Here's an overview of how it works with Google+:


So how easy is it in Oracle Application Express (APEX) to do such social authentication?

Unfortunately Oracle APEX doesn't provide us with a native social authentication mechanism just yet. But nothing prevents you from building it yourself.

Here are the options I reviewed:

  • Custom build: in PL/SQL you call the different URLs and make some procedures public, so that when the social network calls back you can intercept the call and move on (see the sketch after this list).
  • Oracle REST Data Services supports OAuth 2.0, and the calls are mostly REST calls, so I also looked into writing the logic in ORDS (and PL/SQL) and integrating with my APEX application that way.
  • Some people in the community wrote an authentication plugin that does the hard work for you.
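For a flavor of what the custom-build option involves, here is a minimal PL/SQL sketch of the token-exchange step using apex_web_service (my own illustration, not the plugin's code; the endpoint is Facebook's, but all the values are placeholders, and the HTTPS call needs the SSL certificate setup I mention below):

declare
  l_names    apex_application_global.vc_arr2;
  l_values   apex_application_global.vc_arr2;
  l_response clob;
begin
  -- exchange the code the social network sent to our public callback
  -- procedure for an access token
  l_names(1) := 'client_id';     l_values(1) := 'YOUR_APP_ID';
  l_names(2) := 'client_secret'; l_values(2) := 'YOUR_APP_SECRET';
  l_names(3) := 'redirect_uri';  l_values(3) := 'https://www.example.com/callback';
  l_names(4) := 'code';          l_values(4) := 'CODE_FROM_CALLBACK';
  l_response := apex_web_service.make_rest_request(
                  p_url         => 'https://graph.facebook.com/oauth/access_token',
                  p_http_method => 'GET',
                  p_parm_name   => l_names,
                  p_parm_value  => l_values);
  -- parse the access token out of l_response and use it for the profile call
end;
/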

I went with the Facebook plugin in combination with my own PL/SQL code.
Peter was so nice to share his work with me - thanks again for that, Peter. I first thought the authentication plugin would be plug-and-play, just like other APEX plugins... but that is not the case.
That has little to do with the way Peter's team implemented it; it has much more to do with the complex setup of SSL certificates, etc. So when downloading the plugin, know that it will take some time to configure. Luckily Peter provides good documentation, which makes it a bit easier.

So, to see the authentication at work, log in with your Facebook account on the wc2014challenge.com site. I extended the plugin a bit so it automatically creates a site account for you behind the scenes; that way, regardless of whether you create a site account or log in with Facebook, it can hook up the scores, bets, etc. in the same way.

Challenges with Social Authentication

If you want to provide Facebook, Google+, LinkedIn and a normal site account in your app, there are some challenges. How do you hook up a person who logs in with Facebook the first time with the same player logging in with Google+ the next time? Maybe you could use the email address? But what if they use different ones? There are many blog posts about this topic and how to work around it, but it would take me too far in this post. I might do a follow-up post later, as it's an interesting challenge.
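One common approach (a sketch with made-up names, not necessarily how I will solve it) is to keep the site account separate from the external identities, so one player can have several logins:

create table user_identity (
  user_id   number        not null,  -- references the site account
  provider  varchar2(30)  not null,  -- 'FACEBOOK', 'GOOGLE', 'LINKEDIN', 'SITE'
  ext_id    varchar2(200) not null,  -- the provider's stable user id
  email     varchar2(200),
  constraint user_identity_uk unique (provider, ext_id)
);

On a first login from a new provider you insert a row; linking then becomes a matter of deciding when two rows may share the same user_id - an email match, or an explicit "link my accounts" action by the user.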

Future 

I really believe that most public sites will allow social authentication, so I hope the ORDS team or the APEX development team will make something available to do social authentication natively in the future. I believe that would be the best solution (fast to implement and secure).

Categories: Development

Security Audit of WC2014Challenge

Dimitri Gielis - Mon, 2014-06-09 15:34
A few weeks ago I asked my friends at RecX to do a security audit of the World Cup 2014 Challenge app. The result was a security assessment document that explained what they tested, why it was important, and what they found. I found it very interesting to see how other (security) people approach your code.

Here are the areas they went into:

Access Control
  • Hidden items
  • Item Protection
  • Page Access Protection
Configuration
  • Session Timeout
Cross-Site Scripting 
  • Column From LOV/Query
  • Direct Output
  • Indirect Output
  • Report Column Display Type
  • Template Variables
Tip: make use of apex_escape.html, apex_escape.html_attribute and utl_url.escape (see the sketch after this list)
Data Protection 
  • Page Autocomplete
Tip: Ensure sensitive data is not held in the browser cache

Warnings
  • Direct URL
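As a quick illustration of the escaping tip above (a sketch with a made-up page item):

-- direct output, vulnerable to cross-site scripting:
htp.p('Hello ' || :P1_NAME);
-- escaped output, safe:
htp.p('Hello ' || apex_escape.html(:P1_NAME));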
You can read more about security in their Hands-On Oracle Application Express Security book.

Thanks Nathan and Tim.
Categories: Development

Telling Tales

Greg Pavlik - Sun, 2014-06-08 17:50
After struggling to find time for many months, I finally was able to sit down and watch, without interruption, Benjamin Bagby's Beowulf performance - an adaptation that relies on Bagby's voice and a reconstruction of a 6th-century, six-tone Anglo-Saxon harp. The performance is engrossing and provokes a strong imaginative response, one that would have been communally experienced. Of course the only way to revive a sense of communal experience in the case of Bagby is to see him perform live; however, given the performance is entirely in Old English and as such mostly unintelligible without subtitles, I think a digital adaptation may be a necessary tradeoff. In many ways, Bagby's Beowulf is a reminder of how impoverished our notion of entertainment is - ephemeral, base, isolating and essentially throwaway as a rule.

By the way, it's not entirely the case that the English are unable to create something of the same texture today - several times during the performance I thought of Judith Weir's one-person, unaccompanied opera King Harald's Saga. Weir's work is much shorter, principally a musical composition and less poetically rich, so it is difficult to compare the two directly: Beowulf remains the province of a balladeer first and foremost, and this is a genre that more and more feels lost to our world - poetry today rarely seems to be meant to be read aloud, and even more rarely follows epic formats. This is a lost social phenomenon, and we are the poorer for it: in fact, the last long work of a balladeer I read was the Ethiopian Enzira Sebhat, itself a medieval work dedicated to the Virgin Mary. The closest - though only indirectly comparable - works to the Enzira Sebhat that I am aware of currently being composed are the akathistos hymns of the Russian Orthodox tradition. And while many of those recent compositions are less-than-accomplished literary works, they unquestionably represent a rich, living and at times very beautiful means of transmitting communal memory and values. Though I am not aware of any recent akathistos compositions with the expressive beauty and originality of the Byzantine hymnographer Romanos the Melodist, the modern akathist has sometimes proven a source of inspiration for exceptionally great art: the late Sir John Tavener's setting of the "thanksgiving akathist" being perhaps the most significant case in point.

Automatic Time Zone Support in WC2014Challenge

Dimitri Gielis - Tue, 2014-06-03 11:27
How do you show people in different time zones the schedule in their own time?

That is the issue I had when building the wc2014challenge.com site.

So I started out by just showing the schedule in the "local time" of the stadium the match was played in, so I didn't have to deal with the issue :)

But as you might expect, people started to ask to see the schedule in their own time.

In previous years I solved the issue by adding a select list, so people could select the time zone they wanted to see the game in. Behind the scenes I reran the query and added the offset to the time - that worked just fine. The challenge this year is that Brazil spans two time zones, so I couldn't really use the old mechanism.
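Roughly, that old approach looked like this (a sketch; the table and item names are made up):

select match_no,
       match_date + :P1_TZ_OFFSET / 24 as local_time
  from matches;

where :P1_TZ_OFFSET holds the offset in hours chosen in the select list.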

In the Oracle database, instead of a DATE column you can use a TIMESTAMP WITH TIME ZONE column, which makes calculating the difference easier. Another option is TIMESTAMP WITH LOCAL TIME ZONE, where you see the data in your own time zone (after ALTER SESSION SET time_zone = <your time zone>).
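A quick illustration of the second option (a sketch, using the same made-up matches table, now with a TIMESTAMP WITH LOCAL TIME ZONE column): the database normalizes the value on insert and converts it back on query, so every session sees the same instant in its own time zone.

create table matches (
  match_no number,
  kick_off timestamp with local time zone
);

alter session set time_zone = 'America/Sao_Paulo';
insert into matches values (1, timestamp '2014-06-12 17:00:00');

alter session set time_zone = 'Europe/Brussels';
select kick_off from matches where match_no = 1;
-- shows 2014-06-12 22:00:00 - the same instant, in Brussels time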

Instead of doing the time zone conversion in the database, I also thought of doing it on the client (browser), with moment.js for example.

They all have advantages and disadvantages... but in the end I decided to use the native APEX way.

Step 1: make sure your column is of type TIMESTAMP WITH LOCAL TIME ZONE:
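For an existing DATE column that could look like this (a sketch; backfilling correctly depends on which time zone the old values were stored in - with two Brazilian zones, possibly per stadium):

alter table matches add (kick_off_tz timestamp with local time zone);

update matches
   set kick_off_tz = from_tz(cast(match_date as timestamp), 'America/Sao_Paulo');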


Step 2: set Automatic Time Zone to Yes in "Edit Globalization Attributes" (Edit definition of your app).


And you are done!

Looking at the schedule, it shows the times in my time zone automatically. The nice thing is that this works across the application, so the calendar automatically shows the times in your time zone too. Very, very nice - no additional code.


So it's very easy to make your APEX application time zone aware... the only drawback I've found is that this solution requires a redirect the first time you hit the site. That is not really good for Google rankings, but the advantages outweigh that for now.

You can also read Joel's blog about automatic time zone support in APEX 4.0, where he built another example you can follow.

Categories: Development

EM12c: Creating a Metric Extension (User Defined Metric) for BPEL Process State OFF

Arun Bavera - Tue, 2014-06-03 11:14

 


 

-- per this metric's purpose, state = 1 identifies BPEL processes whose state is OFF
select domain_ref, process_id, state from ORABPEL.PROCESS where state = 1;

 

 


1) Save as Deployable Draft

2) Publish Metric Extension

3) Deploy to Targets: Cluster Database

4) Add this metric to your Incident Rules to get alerts


Categories: Development

Memories of the way we were...

Greg Pavlik - Sat, 2014-05-31 15:13
The fascinating thing about Hadoop is the obviousness of its evolutionary needs. For example, MapReduce coupled with reliable scale-out storage had a powerful - even revolutionary - effect for organizations with both lots of data and multi-structured data. Out of the gate, Hadoop unlocked data "applications" that were, for all intents and purposes, unimplementable before. At the same time, it didn't take much imagination to see that separating the compute model from resource management would be essential for future applications that did not fit well with MapReduce itself. It took a lot of work and care to get YARN defined, implemented and hardened, but the need for YARN itself was fairly obvious. Now it is here, and Hadoop is no longer about "batch" data processing.

Note, however, that it takes a lot of work to make the evolutionary changes available. In some cases, bolt-on solutions have emerged to fill the gap. For key-value data management, HBase is a perfect example. Several years ago, Eric Baldeschwieler was pointing out that HDFS could have filled that role. I think he was right, but the time it would have taken to get "HBase-type" functionality implemented via HDFS would have been a very long path indeed. In that case, the community filled the gap with HBase, and it is being "back integrated" into Hadoop via YARN in a way that will make for a happier co-existence.

Right now we are seeing multiple new bolt-on attempts to add functionality to Hadoop. For example, there are projects to add MPP databases on top of Hadoop itself. It's pretty obvious that this is at best a stop-gap again - and one that comes at a pretty high price. I don't know of anyone who seriously thinks that a bolt-on MPP is ultimately the right model for the Hadoop ecosystem. Since the open source alternatives look to be several years away from being "production ready", that raises an interesting question: is Hadoop evolution moving ahead at a similar or even more rapid rate to provide a native solution - a solution that will be more scalable, more adaptive and more open to a wider range of use cases and applications, including alternative declarative languages and compute models?

I think the answer is yes: while SQL on Hadoop via Hive is really the only open source game in town for production use cases - and it's gotten some amazing performance gains in the first major iteration on Tez, which we'll talk more about in the coming days - it's clear that the Apache communities are beginning to deliver a new series of building blocks for data management at scale and speed: Optiq's cost-based optimizer; Tez for structuring multi-node operator execution; ORC and vectorization for optimal storage and compute; HCat for DDL. But what's missing? Memory management. And man, has it ever been missing - that should have been obvious as well (and it was - one reason so many people are interested in Spark for efficient algorithm development).

What we've seen so far has been two extremes when it comes to supporting memory management (especially for SQL) - all disk and all memory. An obvious point here is that neither is ultimately right for Hadoop. This is a long-winded intro to point to two interrelated pieces by Julian Hyde and Sanjay Radia unveiling a model that is being introduced across multiple components, called Discardable In-memory Materialized Query (DIMMQ). Once you see this model, it becomes obvious that the future of Hadoop for SQL - and not just SQL - is being implemented in real time. Check out both blog posts:

http://hortonworks.com/blog/dmmq/

http://hortonworks.com/blog/ddm/


Do animals have souls?

FeuerThoughts - Thu, 2014-05-29 08:04
OK, first of all, don't tell me your answer to this question. That would make the rest of this post seem a bit rude.

Here is one of the dumbest questions I can ever imagine a person asking, much less answering:
Do animals have souls?
How utterly ridiculous.
No one knows what a soul is. No one knows what it looks like, what it means, whether or not it really exists.
Furthermore, we certainly have no idea - please allow me to repeat that because I think it is so fundamental to accept this as fact: we have no idea at all - of what is going on inside an animal’s head. Clearly, a whole lot is going on, if you take the time to pay attention to animals and think about what it takes to do what they do. But many of the things humans blithely state as fact regarding animals, such as “they don’t know the difference between right and wrong,” are fundamentally meaningless, because we simply cannot know what is going on inside another creature’s mind. We just make the assumption that they are really super different from us in all the ways that matter - to us.
We are intelligent, moral, sentient. We are smart and they are dumb, brute animals. We are conscious, we have history, philosophy, nuclear power. What do animals have? Nothing!
Oh really? How do we know what animals have? Or even what “have” means to a butterfly or a snake or a black bear? Again, we really have no idea whatsoever what animals have, what they want, or how they would feel about killing others just to make themselves comfortable (something that we humans do every second of every day).
So we make the most self-serving assumption imaginable. We simply outright declare that other creatures have no souls, are not sentient. They are food or threat or benign, but they are not like us.
We will continue to reject the evidence of our senses, the clear demonstrations of sentience, of complex social structures, in other animals. That way we don’t have to feel bad about enslaving them and killing them. Think for just a moment about how smart pigs are, and then think about pig farms in which tens of thousands of these poor creatures live short miserable lives - brought into this world for the very purpose of slaughtering them for bacon. And then later a dam bursts and an entire town is swamped with pig feces from the refuse lake at the farm. Go, humans, go!
I sure am glad there wasn’t and isn’t a species of creature on this planet that's three times our size, extremely powerful and licking its lips at the prospect of a nicely smoked human torso. 
We do not know what goes on inside a pig’s head, but it sure seems like they can feel and express terror. 
So, yes, humans will keep on keeping on, keep on consuming, reproducing, and assuming. But that doesn't mean we can’t try to recover a shred, a mere shred, of our individual dignity by at least acknowledging what we are doing, and taking at least one step, no matter how small, to help heal our planet and our co-inhabitants.
We can start by acknowledging, accepting, that the thing that we believe makes us unique and special among all living things is simply an unknowable assumption we make. It is an arbitrary, self-serving action - and brings into question the very idea that humans can be considered moral creatures. 
Categories: Development

Upgrading the MiniDLNA on Seagate GoFlex Home from 1.0.22 to 1.1.2

Arun Bavera - Mon, 2014-05-26 00:19

1) SSH to your GoFlex Home; the format of the username will be:

USERNAME_hipserv2_seagateplug_XXXX-XXXX-XXXX-XXXX@<IPADDRESS>

Change to root access

sudo -E -s

 

My GoFlex Environment

bash-3.2# uname -a
Linux axentraserver.mygoflex.seagateshare.com 2.6.22.18 #16 Thu Jun 17 01:37:53 EDT 2010 armv5tejl armv5tejl armv5tejl GNU/Linux
HipServ 2.7.1-391

 

2) Install the ipkg package manager as shown below:
http://ipkg.nslu2-linux.org/feeds/optware/gumstix1151/cross/unstable/
http://www.openstora.com/wiki/index.php?title=Installing_a_package_manager


  1. cd ~
  2. mkdir ipkg
  3. cd ipkg
  4. wget http://ipkg.nslu2-linux.org/feeds/optware/cs08q1armel/cross/stable/ipkg-opt_0.99.163-10_arm.ipk
  5. tar -xzf ipkg-opt_0.99.163-10_arm.ipk
  6. cp ./data.tar.gz /data.tar.gz
  7. cd /
  8. tar -xzf data.tar.gz
  9. rm data.tar.gz
 10. echo src cs08q1armel http://ipkg.nslu2-linux.org/feeds/optware/cs08q1armel/cross/stable >> /opt/etc/ipkg.conf
 11. /opt/bin/ipkg update

3) Update the PATH by adding the following to /etc/profile:
export PATH=/usr/local/bin:/usr/bin:/bin:/opt/bin:/usr/sbin:/opt/sbin:/usr/sbin:/sbin:/usr/local/sbin
export LD_LIBRARY_PATH=/usr/lib:/opt/lib:/lib:/usr/local/lib

 

4) Get the latest MiniDLNA from:
cd /home/0common
wget http://sourceforge.net/projects/minidlna/files/minidlna/1.1.2/

5) Install GCC:
ipkg list | grep gcc
/opt/bin/ipkg install gcc

6) Install Make:
ipkg install make

7) Install these versions of the libraries and create softlinks in /usr/lib:

wget http://www.openstora.com/files/albums/uploads/libjpeg_so_8_0_2.zip
wget http://www.openstora.com/files/albums/userpics/15704/libavutil_so_49_15_0.zip

ipkg install unzip

unzip -d /usr/lib libjpeg_so_8_0_2.zip
unzip -d /usr/lib libavutil_so_49_15_0.zip

cd /usr/lib
ln -s libjpeg_so_8_0_2 libjpeg.so
ln -s libjpeg_so_8_0_2 libjpeg.so.8
ln -s libavutil_so_49_15_0 libavutil.so.49
ln -s /usr/lib/libFLAC.so.7.0.0 /usr/lib/libFLAC.so.8

8) Now build and install the latest MiniDLNA downloaded in step 4:

cd /home/0common
# the archive and directory names below are assumed from the 1.1.2 download
tar -xzf minidlna-1.1.2.tar.gz
cd minidlna-1.1.2
./configure; make; make install

Take a backup of the old MiniDLNA binary and put the new one in place:

cp /usr/sbin/minidlna /usr/sbin/minidlna_1_0_22

cp /home/0common/minidlna-1.1.2/minidlnad /usr/sbin/minidlna

Make sure the owner and permissions are the same as on the old binary.


Link the old config file into /etc:
ln -s /etc/miniupnpd/minidlna.conf /etc/minidlna.conf

Update the log directory in the config file:
log_dir=/tmp/minidlna

Update the DB directory:
db_dir=/var/cache/minidlna

# set this to strictly adhere to DLNA standards.
# * This will allow server-side downscaling of very large JPEG images,
#   which may hurt JPEG serving performance on (at least) Sony DLNA products.
strict_dlna=no

Common commands to start and stop MiniDLNA (log in over SSH, change to root as in step 1, and verify with whoami that you are "root"):
/etc/init.d/minidlna.init stop
/etc/init.d/minidlna.init status
rm /tmp/minidlna/files.db    # removes the MiniDLNA database

With 1.1.2 the DB will be in /var/cache/minidlna.

Rebuild the Database
/usr/sbin/minidlna -f /etc/miniupnpd/minidlna.conf -R -d

/etc/init.d/minidlna.init start
/etc/init.d/minidlna.init restart

Running in Debug mode
/usr/sbin/minidlna -d -f /etc/miniupnpd/minidlna.conf

=====================================================


References:


http://blog.philippklaus.de/2011/04/install-archlinuxarm-on-the-seagate-goflex-home/
Install ArchLinuxARM on the Seagate GoFlex Home
http://archlinuxarm.org/platforms/armv5/seagate-goflex-home
http://doncharisma.com/2013/09/22/build-your-own-pro-nas-seagate-goflex-net-with-debian-linux-raid1-and-openmediavault/

Upgrading from 1.0.22 to 1.0.25
http://www.openstora.com/wiki/index.php?title=MiniDLNA_for_Stora/Updating

http://support.goflexhome.hipserv.com/en/reflash/index.html

Categories: Development

Oracle EM12c Release and Patch Schedules

Arun Bavera - Thu, 2014-05-22 09:12
PSU: Oracle Recommended Patches (PSU) for Enterprise Manager Base Platform (All Releases) (Doc ID 822485.1); for other components, see Doc ID 854428.1. Released quarterly:

  • January 17th
  • April 17th
  • July 17th
  • October 17th
==============================================================

Bundle Patches: released monthly (last day of every month), covering Management Agent version 12.1.0.3 and higher and Cloud Control Plug-ins (both OMS-side and Agent-side); no OMS/WLS patches. See the Enterprise Manager 12.1.0.3 Bundle Patch Master Note (Doc ID 1572022.1) and the Enterprise Manager 12.1.0.4.0 (PS3) Master Bundle Patch List (Doc ID 1900943.1).
==============================================================
Release Schedule of Current Enterprise Manager Releases and Patch Sets (10g, 11g, 12c) (Doc ID 793512.1). A new OMS base release (like 12.1.0.3, 12.1.0.4) comes roughly every 9 months:

  • 12.1.0.2 → 11-Sep-2012
  • 12.1.0.3 → 28-Jun-2013
  • 12.1.0.4 → expected in Q2 CY14 (May-2014)


==============================================================

Release Schedule for Enterprise Manager Cloud Control Plug-ins (Doc ID 1486995.1)




Refer: Oracle Premier Support Enterprise Manager Product News - February 2014 (Doc ID 1449687.1)

Categories: Development

Book Review - Oracle ADF Enterprise Application Development Made Simple (Second Edition)

Shay Shmeltzer - Fri, 2014-05-16 15:35

I got a copy of the new edition of Sten Vesterli's book about enterprise development with ADF, so I wanted to give a quick review update. I reviewed the first edition three years ago - and you can read that review here.

The second edition is not drastically different from the first one, and it shouldn't be. Most of the best practices that Sten pointed out in his original book still apply today. What has changed a bit over the years are some of the tools enterprises use to manage their applications - and this is what some of the updates are about: for example, in addition to covering Subversion there is now a Git section. In addition, Sten incorporates a few more architectural and conceptual bits and pieces that he has collected over the past three years working with various customers.

If you want a video summary of the type of topics Sten covers in this book, you can watch this seminar he recorded for one of our virtual developer days.

This is definitely a book that should be part of the reading material for groups about to embark on an Oracle ADF development project - it will save them from some common mistakes and put them on the right track to a well-structured project and team.

To get the book or read a sample chapter, click here.

It is also worth mentioning that over the past year we at Oracle have been quite actively increasing the amount of material we produce around these aspects, and we centralize it in our ADF Architecture Square on OTN.

If you haven't visited that site or subscribed to the ADF Architecture YouTube channel yet - it's time you do so.

Categories: Development

Pushing JDK to hosts using EM12c Custom Procedures

Arun Bavera - Fri, 2014-05-16 15:01
Installing JDK to Hosts using EM12c Custom Procedures
1) Go to the Software Library.


2) Create a folder to keep all your custom procedures.





3) Right-click on the new folder and create the Directive.


4) Update the Directive details.


5) Provide the input parameters and choose the shell to be used for this directive.

6) Create the following script, Install_jdk.sh, and upload it:
#!/bin/sh
# Install_jdk.sh <JDK_Home_directory>
export WorkDirectory=$PWD
echo "WorkDirectory=${WorkDirectory}"
export JDK_HOME=$1
echo "JDK Home=${JDK_HOME}"
echo "Creating directory ${JDK_HOME}"
/bin/sh -c 'mkdir -p $JDK_HOME'
echo "Changing the permissions"
/bin/sh -c 'chmod +x *'
echo "Current directory=$PWD"
echo "Installing JDK at ${JDK_HOME}"
/bin/sh -c 'cd $JDK_HOME; $WorkDirectory/jdk-6u45-linux-x64.bin'

Note: make sure this file is saved with UNIX line terminators.

7) Upload the JDK software from any agent host; since it exceeds 25MB, it can't be uploaded from a local desktop.
Note: make sure to choose the Install_jdk.sh script as the main file.

8) Save and Upload your directive.

9) Now you can create your custom procedure and use this directive to deploy the JDK to any host managed by EM12c.

10) Go to the Procedure Library.

11) Create a new procedure.

12) Provide the name and temp stage directory info on the target host.

13) Choose Add Row to add one Target list row.

14) Provide a Procedure Variable; this is a global variable for the whole procedure.
15) Go to the Procedure Steps tab and choose Insert on the “Default Phase”.

16) Name it “Install_JDK” with type “Directive”.


17) Choose the Directive.
Note: as of now you have to re-choose the directive if any changes are made to it; choosing the latest version does not always work.

18) Go to the next step and choose “Run Directive” and “Perform Cleanup”.

19) Assign the global variable (JDK_Home) to the local variable JDK_HOME and finish.

20) Now you are ready to launch this procedure and deploy the JDK on any host managed by EM12c, in any directory where the chosen Named Credentials have the proper privileges.
Categories: Development

New Continuous Integration tutorial published

Lynn Munsinger - Mon, 2012-07-02 09:44
Hot off the press – a new continuous integration tutorial. It’s really not just about continuous integration, though! You’ll find it useful even if you aren’t using a continuous integration server like Hudson. It’s useful if you are doing any part of the scenario it documents: Setting up Team Productivity Center for your team and [...]

Advanced ADF eCourse, Part Deux

Lynn Munsinger - Tue, 2012-06-19 15:11
In February, we published the first in a series of FREE(!) online advanced ADF training: http://tinyurl.com/advadf-part1 The response to that course has been overwhelmingly positive as more and more people are moving past the evaluation/prototype stages with ADF and looking for more advanced topics. I’m pleased to relay the good news that the 2nd part [...]

Fun with Hudson, Part 1.1

Lynn Munsinger - Tue, 2012-06-05 09:19
Earlier I posted that I had used the following zip command in the 'execute shell' action for my Hudson build job: zip -r $WORKSPACE/builds/$JOB_NAME-$BUILD_NUMBER * -x '*/.svn/*' -x '*builds/*' This zips up the content of the exported source, so that I can send it on to team members who need the source of each build [...]

Hiring a Curriculum Developer

Lynn Munsinger - Tue, 2012-05-15 09:34
If you are an instructional designer with an eye for technologies like ADF, or if you are an ADF enthusiast and excel at creatively producing technical content, then ADF Product Management would like to hear from you. We’re looking for a curriculum developer to join our ADF Curriculum team, which is tasked with ensuring that [...]

New ADF Insider on Layouts

Lynn Munsinger - Mon, 2012-03-26 13:22
I’ve published an ADF Insider session that helps de-mystify the ADF Faces components and how to work with them (and not against them), when building ADF applications. There’s also some great information on building ADF prototypes. Take a look here: http://download.oracle.com/otn_hosted_doc/jdeveloper/11gdemos/layouts/layouts.html