Why should you learn about ZooKeeper? If you use applications such as HBase, Neo4j, Solr, Accumulo, and the like, read on...
You can read much more on the Apache ZooKeeper website. Anyway, if you are looking for a book about Apache ZooKeeper, I recommend ZooKeeper: Distributed Process Coordination by Flavio Junqueira and Benjamin Reed.
The book guides readers through using Apache ZooKeeper to manage distributed systems. It has three parts: ZooKeeper Concepts and Basics, Programming with ZooKeeper, and Administering ZooKeeper.
So, this book is good for readers who are interested in ZooKeeper and for those who use applications built on it. It will help readers understand ZooKeeper concepts and develop programs with ZooKeeper. Personally, I like it because it gave me ideas for administering ZooKeeper.
The book covers:
- Learn how ZooKeeper solves common coordination tasks
- Explore the ZooKeeper API’s Java and C implementations and how they differ
- Use methods to track and react to ZooKeeper state changes
- Handle failures of the network, application processes, and ZooKeeper itself
- Learn about ZooKeeper’s trickier aspects dealing with concurrency, ordering, and configuration
- Use the Curator high-level interface for connection management
- Become familiar with ZooKeeper internals and administration tools
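To make the "coordination tasks" idea concrete, here is a hedged sketch of the classic ZooKeeper lock recipe the book describes: each client creates an ephemeral sequential znode, and whoever holds the lowest sequence number owns the lock. This is plain Python with an in-memory stand-in for the server, not the real ZooKeeper client API; all names here are made up for illustration.

```python
# Minimal in-memory simulation of ZooKeeper's lock recipe.
# A real implementation would use the ZooKeeper Java or C API
# and watch the next-lowest znode for deletion.

class FakeZnodeStore:
    """Stands in for a ZooKeeper server holding sequential ephemeral znodes."""
    def __init__(self):
        self.counter = 0
        self.children = []          # znode names under the lock's parent node

    def create_sequential(self, prefix):
        # ZooKeeper appends a monotonically increasing 10-digit suffix
        name = "%s%010d" % (prefix, self.counter)
        self.counter += 1
        self.children.append(name)
        return name

    def delete(self, name):
        # models explicit release or ephemeral-node cleanup on session expiry
        self.children.remove(name)

def acquire(store, client_id):
    """Each contender creates one sequential znode."""
    return store.create_sequential("lock-%s-" % client_id)

def holds_lock(store, my_node):
    """The znode with the lowest sequence suffix owns the lock."""
    return min(store.children, key=lambda n: n[-10:]) == my_node

store = FakeZnodeStore()
a = acquire(store, "a")
b = acquire(store, "b")
print(holds_lock(store, a), holds_lock(store, b))   # True False
store.delete(a)                # client a releases (or its session expires)
print(holds_lock(store, b))    # True: client b is next in line
```

The real recipe adds watches so waiting clients are notified instead of polling, which is exactly the kind of detail the book walks through.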
Written By: Surachart Opun http://surachartopun.com
In the spirit of Thanksgiving this week being celebrated on Thursday in the USA
This post is shared from our Oracle Java Community.
Hinkmond Wong's Weblog
First, we need to test the temperature probe before sticking it into unknown places, namely our delicious IoT bird on Thanksgiving. So, take your Go!Temp USB temperature probe and plug it into your Raspberry Pi device, just like in this photo.
If all went well on your Raspberry Pi, you should be able to bring up a terminal shell connected to your RPi and type "lsusb" to verify that the Go!Temp probe is now connected.
pi@raspberrypi ~ $ lsusb
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 001 Device 002: ID 0424:9512 Standard Microsystems Corp.
Bus 001 Device 003: ID 0424:ec00 Standard Microsystems Corp.
Bus 001 Device 005: ID 08f7:0002 Vernier EasyTemp/Go!Temp
If your output looks like above, especially the last line where it says the Vernier Go!Temp was recognized and is connected as Device 005, you are golden.
One last check before we start to program using a Java SE Embedded app to grab the temperature readings is to make sure the /dev/ldusb0 device is present. So, type this command and make sure your output matches:
pi@raspberrypi ~ $ ls -l /dev/ldusb0
crw------T 1 root root 180, 176 Nov 18 17:25 /dev/ldusb0
If all that looks good, you're ready for the next step, which is to write a Java SE Embedded app to read the temperature values, and eventually write code with IoT intelligence to tweet out the status of your turkey while it's cooking so that it becomes an Internet of Things connected bird on Twitter. Look for that in the next part of this series... Mmmmm... I can almost smell that turkey roasting...
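As a taste of that next step, here is a hedged sketch of the parsing side, in Python rather than the Java SE Embedded app itself. It assumes the commonly described Go!Temp report layout (a sample-count byte, a sequence byte, then three little-endian 16-bit samples scaled by 1/128 to degrees Celsius); check Vernier's documentation before relying on this.

```python
import struct

# Hedged sketch: decode one 8-byte Go!Temp report as read from /dev/ldusb0.
# Assumed layout (verify against Vernier's docs):
#   byte 0: number of valid samples in this report
#   byte 1: sequence counter
#   bytes 2-7: three little-endian int16 samples, degrees C = raw / 128

def parse_report(report):
    count = report[0]
    samples = struct.unpack_from("<3h", report, 2)
    return [s / 128.0 for s in samples[:count]]

# Simulated packet: one valid sample with raw value 2560 -> 20.0 C
packet = bytes([1, 0]) + struct.pack("<3h", 2560, 0, 0)
print(parse_report(packet))   # [20.0]
```

On the Pi itself you would read 8 bytes at a time from /dev/ldusb0 and feed each chunk through a decoder like this.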
See the full series on the steps to this cool demo:
Internet of Things (IoT) Thanksgiving Special: Turkey Tweeter (Part 1)
The value that Oracle WebCenter brings to organizations has been well documented, but it is also multi-faceted. When there are multiple facets to anything, that can mean additional complexity that seems daunting until you isolate and prioritize the specific business challenges that need addressing, and then take the time to "draw out" how best to address those challenges in the most cost-effective way possible.
Redstone Content Solutions is a long time Oracle partner and has assisted many organizations with their planning, deployment and technical expertise. Their mission statement is simply to "provide organizations with the tools necessary to securely accumulate and disseminate knowledge". This may include a variety of actual technologies deployed to work together in order to achieve specific business goals, including WebCenter Portal, Sites and Content.
To better illustrate how WebCenter can be used to meet common business objectives, Redstone has created an entertaining short video to show how WebCenter can best be used to create, share, manage and distribute information to the benefit of your business. Take a minute to check it out here and be sure to visit them at http://www.redstonecontentsolutions.com to learn more.
Security on the cloud has been a general topic of discussion since its debut as the new standard in business IT. As these storage services mature, however, the safety of the cloud is being consistently reinforced.
Securing the cloud before sending data away
BizTech reported that new strategies are being developed to help decision-makers secure their information before deployment. Encryption, for instance, enables IT managers to attach a complex code to data before it is outsourced. With this solution, only owners of the decryption key can unlock the information.
The cloud provides corporations with remote access, which is especially useful for decision-makers who oversee a large workforce. According to the source, this strategy is also useful for big data corporations with varying levels of information sensitivity.
In addition to pre-transition security, the source reported that cloud solutions are enabling IT teams to craft hybrid services that can be launched across existing storage infrastructures. One of the primary concerns associated with new cloud deployments is having to give up direct control over some digital management responsibilities.
Because of the sensitive nature of information in some industries, such as banking institutions and health-related facilities, losing ultimate authority over client records can be daunting for decision-makers, but by integrating an on-premises solution with the cloud, it's possible to retain data in-house.
Using the cloud's agility to reinforce customization
The agility of cloud-based solutions also enhances security. The cloud's enhanced flexibility enables decision-makers to deploy unique and customizable applications across digital architectures. Database administration services, for example, attach to the cloud and provide companies with increased security, more categorization options and more freedom to focus on other tasks.
Along with providing businesses greater protection against cybercrime, remote DBA experts will also help maintain information, allowing decision-makers to explore more options. PandoDaily reported that with a little ingenuity, it's possible to make the cloud more secure than legacy computing methods. For smaller enterprises, especially, the enhanced customization and the access to improved computing features make generating a high-quality network a cost-effective and innovative solution.
As business leaders begin considering their options, it's important for IT managers to recognize how the cloud's security can be used to safeguard the company's digital assets.
More Excel Support
Dodeca has always been strong on Excel version support and this version delivers even more Excel functionality. Internally, we use the SpreadsheetGear control, which does a very good job with Excel compatibility. This version of Dodeca integrates a new version of SpreadsheetGear that now has support for 398 Excel functions including the new SUMIFS, COUNTIFS, and CELL functions.

Excel Page Setup Dialog
The new version of Dodeca includes our implementation of the Excel Page Setup Dialog which makes it easy for users to customize the printing of Dodeca views that are based on Excel templates. Note that for report developers, the Excel Page Setup has also been included in the Dodeca Template Designer.
New PDF View Type
Customers who use PDF files in their environments will like the new PDF View Type. In previous releases of Dodeca, PDF documents displayed in Dodeca opened in an embedded web browser control. Beginning in this version, Dodeca includes a dedicated PDF View type that uses a specialized PDF control.
View Selector Tooltips
Finally, users will like the new View Selector tooltips which optionally display the name and the description of a report as a tooltip.
Performance
Performance is one of those things that users always appreciate, so we have added a new setting that can significantly improve performance in some circumstances. Dodeca has a well-defined set of configuration objects that are stored on the server and we were even awarded a patent recently for the unique aspects of our metadata design. That being said, depending on how you implement reports and templates, there is the possibility of having many queries issued to the server to check for configuration updates. In a few instances, we saw that optimizing the query traffic could be beneficial, so we have implemented the new CheckForMetadataUpdatesFrequencyPolicy property. This property, which is controlled by the Dodeca administrator, tells Dodeca whether we should check the server for updates before any object is used, as was previously the case, only when a view opens, or only when the Dodeca session begins. We believe the latter case will be very useful when Dodeca is deployed in production as objects configured in production often do not change during the workday and, thus, network traffic can be optimized using this setting. The screenshot below shows where the administrator can control the update frequency.
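To make the three options concrete, here is a hedged Python sketch of how the policy affects query traffic. The property name comes from the post, but the value names and the counting logic below are illustrative, not Dodeca's actual implementation.

```python
# Illustrative model of CheckForMetadataUpdatesFrequencyPolicy.
# The three policy value names here are made up to match the description.

BEFORE_OBJECT_USE = "BeforeObjectUse"      # previous behavior
WHEN_VIEW_OPENS = "WhenViewOpens"
WHEN_SESSION_STARTS = "WhenSessionStarts"  # best for stable production configs

class MetadataCache:
    def __init__(self, policy):
        self.policy = policy
        self.checks = 0            # update queries issued to the server

    def _check_server(self):
        self.checks += 1

    def session_start(self):
        if self.policy == WHEN_SESSION_STARTS:
            self._check_server()

    def open_view(self, object_count):
        if self.policy == WHEN_VIEW_OPENS:
            self._check_server()
        for _ in range(object_count):
            if self.policy == BEFORE_OBJECT_USE:
                self._check_server()

# A session that opens 3 views touching 20 objects each:
for policy in (BEFORE_OBJECT_USE, WHEN_VIEW_OPENS, WHEN_SESSION_STARTS):
    cache = MetadataCache(policy)
    cache.session_start()
    for _ in range(3):
        cache.open_view(20)
    print(policy, cache.checks)    # 60, then 3, then 1
```

The point of the sketch: when production objects rarely change intraday, checking once per session collapses dozens of round trips into one.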
Though users will like these features, we have put a lot of new things in for the people who create Dodeca views and those who administer the system. Let’s start with something that we think all Dodeca admins will use frequently.

Metadata Property Search Utility
As our customers continue to expand their use of Dodeca, the number of objects they create in the Dodeca environment continues to grow. In fact, we now have customers who have thousands of different objects that they manage in their Dodeca environments. The Metadata Property Search Utility will help these users tremendously.
This utility allows the administrator to enter a search string and locate every object in our system that contains that string. Once a property is located, there is a hyperlink that will navigate to the given object and automatically select the relevant property. This dialog is modeless, which means you can navigate to any of the located items without closing the dialog.
Note: this version does not search the contents of Excel files in the system.

Essbase Authentication Services
In the past, when administrators wished to use an Essbase Authentication service to validate a login against Essbase and automatically obtain Dodeca roles based on the Essbase user’s group memberships, they had to use an Essbase connection where all users had access to the Essbase application and database. The new ValidateCredentialsOnly property on both of the built-in Essbase Authentication services now flags the service to check login credentials at the server-level only, eliminating the need for users to have access to a specific Essbase database.

New Template Designer Tools
Prior to Dodeca 6.x, all template editing was performed directly in Excel. Since that time, however, most template design functionality has been replicated in the Dodeca Template Designer, and we think it is preferable due to the speed and ease of use with which users can update templates stored in the Dodeca repository. We have added a couple of new features to the Template Designer in this version. The first tool is the Group/Ungroup tool that allows designers to easily apply Excel grouping to rows and/or columns within the template. The second new tool is the Freeze/Unfreeze tool that is used to freeze rows and/or columns in place for scrolling.

Parameterized SQL Select Statements
Since we introduced the SQLPassthroughDataSet object in the Dodeca 5.x series, we have always supported the idea of tokenized select statements. In other words, the SQL could be written so that point-of-view selections made by users could be used directly in the select statement. In a related fashion, we introduced the concept of parameterized insert, update, and delete statements in the same version. While parameterized statements are similar in concept to tokenized statements, there is one important distinction under the covers.
In Dodeca, parameterized statements are parsed and converted into prepared statements that can be used multiple times, which results in more efficient use of server resources. The parameterized select statement was introduced in this version of Dodeca so that customers using certain databases that cache the prepared statement can realize improved server efficiency on their select statements.

Workbook Script Formula Editor Improvements
We have also been working hard to improve extensibility for developers using Workbook Scripts within Dodeca. In this release, our work focused on the Workbook Script Formula Editor. The first thing we added here is color coding that automatically detects and distinguishes Excel functions, Workbook Script functions, and Dodeca tokens. In the new version, Excel functions are displayed in green, Dodeca functions and parentheses are displayed in blue, and tokens are displayed in ochre. Here is an example.
In addition, we have implemented auto-complete for both Excel and Dodeca functions.
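Stepping back to the parameterized SQL discussion above, the tokenized-vs-parameterized distinction can be sketched with plain Python and sqlite3 (not Dodeca itself): a tokenized statement splices the user's selection into the SQL text, producing a new statement to parse each time, while a parameterized statement is prepared once and reused with bound values.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (name TEXT, dept TEXT)")
conn.executemany("INSERT INTO emp VALUES (?, ?)",
                 [("Alice", "Accounting"), ("Bob", "Sales")])

# Tokenized: the point-of-view selection is substituted into the SQL text,
# so every distinct value yields a distinct statement for the server to parse.
dept = "Accounting"
tokenized = "SELECT name FROM emp WHERE dept = '%s'" % dept
print(conn.execute(tokenized).fetchall())            # [('Alice',)]

# Parameterized: one statement, values bound at execution time; the
# database can cache and reuse the prepared plan across executions.
parameterized = "SELECT name FROM emp WHERE dept = ?"
for d in ("Accounting", "Sales"):
    print(conn.execute(parameterized, (d,)).fetchall())
```

The table and values here are made up; the point is that the second form lets the database reuse one prepared statement for both departments.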
New SQLException Event
Version 6.6 of Dodeca introduces a new SQLException event that provides the ability for application developers to customize the behavior when a SQL Exception is encountered.

XCopy Release Directory
Beginning in version 6.6, the Dodeca Framework installation includes a pre-configured directory intended for customers who prefer to distribute their client via XCopy deployment instead of using Microsoft ClickOnce distribution. The XCopy deployment directory is also for use by those customers who use Citrix for deployment.

Mac OS X Release Directory
The Dodeca Framework installation now includes a pre-compiled Dodeca.app deployment for customers who wish to run the Dodeca Smart Client on Mac OS X operating systems. What that means is that Dodeca now runs on a Mac without the need for any special Windows emulators. Dodeca does not require Excel to run on the Mac (nor does it require Excel to run on Windows, for that matter), so you can certainly save your company significant licensing fees by choosing Dodeca for your solution.
In short, you can see we continue to work hard to deliver functionality for Dodeca customers. As always, the Dodeca Release Notes provide detailed explanations of all new and updated Dodeca features. As of today, we have decided to make the Release Notes and other technical documents available for download to non-Dodeca customers. If you are curious about all of the things Dodeca can do, and if you aren't afraid to dig into the details, you can now download our 389 page cumulative Release Notes document from the Dodeca Technical Documents section of our website.
Our new team members, Raymond and Tony, have been busy in their short time with us, and they’re embracing the AppsLab way.
What way is that you ask? Since the beginning, we’ve always started with an idea and moved quickly to build something conceptual to see how and if the idea works.
Connect began life as the IdeaFactory, which Rich (@rmanalan) put together in 24 hours to give life to our idea about enterprise social networking. More recently, Anthony’s (@anthonysali) new toy, the Google Glass, begat the Fusion CRM Glass app.
To be clear, none of this is product. It’s not even really project work, although we do sometimes launch projects based on the initial concept work. This is just smart developers, messing around with ideas, trying to see what works.
Over the years, we’ve built lots of these demos, which I’m calling concept demos lately. Some have evolved into full-scale projects. Others have been moth-balled into our Git repo, which I’m told has something like 40-some odd projects in various states of completeness.
I like to think that code never dies. It just waits around for the right circumstances.
Sorry about that, won’t happen again.
Anyway, with Anthony and Noel (@noelportugal) tied up with travel and other projects, Raymond and Tony have taken the baton and cranked out a couple of cool concept demos.
First, they collaborated to build a working geo-fencing demo. The idea here is that data on a device should be subject to physical location, e.g. patient data in a hospital or customer-sensitive bank data. If the device is within the fence, the data exist and can be accessed; when the device leaves the fence perimeter, the data are removed from the device and cannot be retrieved from the server.
Here are some shots of the concept demo at work.
Tony did the groundwork development for this one, and Raymond cleaned it up to demo more cleanly. The toughest part of this one was spoofing the GPS with a fake location to fool it into believing it was inside/outside the geo-fence.
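The fence check itself is conceptually simple; here's a hedged sketch (plain Python, made-up coordinates, not the actual demo code): compute the great-circle distance from the device's GPS fix to the fence center and compare it to the fence radius.

```python
import math

# Hedged sketch of a geo-fence check: data is accessible only while the
# device's GPS fix lies within radius_m of the fence center.

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    r = 6371000.0                      # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def inside_fence(pos, center, radius_m):
    return haversine_m(pos[0], pos[1], center[0], center[1]) <= radius_m

hospital = (37.5297, -122.2659)        # fence center (made-up coordinates)
print(inside_fence((37.5299, -122.2661), hospital, 500))   # True: on campus
print(inside_fence((37.7749, -122.4194), hospital, 500))   # False: across town
```

Spoofing the GPS, as mentioned above, just means feeding fake (lat, lon) pairs into a check like this to exercise both branches.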
Second, Jeremy, our overlord, owns a Pebble watch, so we’ve been messing about with one for giggles. Possibly as a joke, Jeremy said we should build a watchface app for sales reps that showed “motivational” metrics like days to quarter close and percentage of sales quota achieved.
So, Raymond did that.
I guess the lesson is that it’s not always a good idea to joke around developers.
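The "motivational" metrics themselves are easy to compute; here's a hedged sketch (the quota figures are invented, and this is not the actual watchface code, which runs in C on the Pebble).

```python
import datetime

# Illustrative computation of the two watchface metrics described above:
# days until the current quarter closes, and percent of quota achieved.

def quarter_end(d):
    """Last day of the calendar quarter containing date d."""
    q_end_month = ((d.month - 1) // 3 + 1) * 3
    if q_end_month == 12:
        return datetime.date(d.year, 12, 31)
    return datetime.date(d.year, q_end_month + 1, 1) - datetime.timedelta(days=1)

def days_to_quarter_close(today):
    return (quarter_end(today) - today).days

def pct_of_quota(booked, quota):
    return round(100.0 * booked / quota, 1)

today = datetime.date(2013, 11, 26)
print(days_to_quarter_close(today))     # 35
print(pct_of_quota(870000, 1200000))    # 72.5
```

Push those two numbers to the watchface every few minutes and you have a sales rep's wrist-mounted nag.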
Why do we do stuff like this?
Aside from proving out ideas, projects like these, the Glass app and the Leap Motion-controlled robot arm allow the guys to go hands-on with the SDKs and APIs of devices we may actually build for in the future. These experiences are incredibly valuable because when it comes time to do a full-scale project, they have a baseline understanding of what we can reasonably do and how easy or difficult it will be.
That experience leads to much better estimates of development times, and it removes some of the uncertainty involved. Oh, and it helps control the scope early in a project, which makes execution and timely delivery achievable.
If you’re counting, that’s a win-win-win-win-win, or something.
Yeah, concept demos are usually rough around the edges, but they’re baked enough to give an idea of what’s possible. Plus, concept demos get done quickly, so ideas can be vetted and move on or be tabled without spending a ton of time and effort, e.g. Raymond and Tony banged out the geo-fencing concept demo in less than two weeks, and Raymond built the Pebble concept in under a week.
And that’s real time. They were doing other things too.

Possibly Related Posts:
- Oracle Fusion Glass App
- First 3 days as a Glass Explorer (Day 1)
- We’ve Grown
- Messing around with Glass and Fusion CRM for Kscope 13
- Oracle Social Network Developer Challenge: Fishbowl Solutions
1. Ventana Research awarded their 2013 Technology Innovation Award for Business Innovation in Human Capital Management to Oracle Fusion HCM.
2. Several recent Fusion HCM Go-Live Announcements:
- American Career College and West Coast University
- CAJ Senior Health Care
- Toshiba Medical Systems
3. Some significant and hard-fought sales wins:
- BT Invests
None of this is really a surprise to me. In fact, the ball is starting to roll a bit early from my POV. Looking at market history, it takes about five years from the general availability release of new Oracle packaged enterprise apps before we begin to see success…Oracle plays a long game with new product releases, especially in the enterprise applications world. Remember when Oracle's Steve Miranda continually reminded us that Fusion Applications were more of a journey than an endpoint? GA of strong product + willingness to play the long game + incremental development...this is what Steve meant.
Connecting the dots here, and being aware of increased customer interest in Fusion HCM in the US market, I see this as the beginning of the tipping point for Fusion HCM.
As Oracle continues to emphasize HCM on cloud, a SaaS approach for small and medium enterprises, and a continued effort of incremental development for Fusion HCM, I expect to see an even bigger build of momentum in 2014.
The kettle is beginning to boil!
A while back, I promised some details on the Google Now TV Card I found accidentally.
I was watching TV via an HDTV antenna and happened to pop open Google Now for some reason or another. Now showed me this card:
Freaky, right? I dismissed it, but my curiosity was piqued. So, I did some digging about the TV Card and went back to give it a whirl.
The card only works on broadcast TV, which makes sense when you reverse engineer it a little. Google Now knows where you are, and based on that, can determine the shows that are being broadcast. That helps narrow down the possibilities, but even given that information, I found the card a bit tough to trigger.
I did my testing during daytime TV, and it failed to detect the Ellen DeGeneres show and another show I tried. It did finally work for the Fox broadcast of the MLB playoff series between the Tigers and Red Sox.
Here is the card it showed:
If I remember correctly, the announcers were talking about Torii Hunter.
Pretty interesting stuff, not mind-blowing, but interesting. This is a pretty powerful example of what Google wants to do though, which is integrate all it knows about the world and you, a.k.a. its knowledge graph, and provide what it thinks might be useful to you at the moment.
Find the comments.

Possibly Related Posts:
- Why Stickers are My New Business Card
- Test Driving Google Wallet
- Competing Innovation in Credit Card Payments?
- The WebCenter Customer Spotlight and OpenWorld Approaches
- Some Light Testing of Google Now
If you hurry, you can watch their episode on Hulu. If you decide to wait, they appear in Season 5 Episode 10. Paul and family are the very first segment, so you won’t have to watch the entire episode, although I did because I’ve never seen Shark Tank. It’s an interesting show.
The premise is simple; companies seeking investment pitch a panel of investors, who, if they’re interested, commit a sum of money in exchange for a stake in the business.
Now for the background. Paul founded this little team back in 2007, along with Rich (@rmanalan) and me. Many of you may know Paul, but you might not know that he and his wife started a little lunchbox business called Yubo in their spare time. I think that was in 2009.
Using their savings, they set out to solve a common problem for families, the lunchbox and the jumble of containers and baggies that go into it. Yubo comes with BPA-free, dishwasher-safe containers that fit snugly inside, along with a reusable cold pack.
Plus, the Yubo’s faceplate is customizable and replaceable. It’s an ingenious product. I bought one for my daughter; she loves it; I’ve known Paul for years, etc. Consider that your disclaimer.
I remember the inception days of Yubo. Paul told me about working with an industrial designer and taking late calls with manufacturers overseas, all in his free time. It all seemed very draining, but like every small business, they soldiered through it because they believed in the idea.
Anyway, it was oddly gratifying for me to see Paul and family on national TV, successfully pitching this panel of luminaries. I can only imagine how elated they felt when they struck a deal.
If you’re wondering, Paul recently left Oracle, again; this time for a small company called Achievers.
Good luck dude. Without you, we wouldn’t be doing cool stuff here.

Possibly Related Posts:
- Hans Rosling on the Joy of Stats
- Imitation as Flattery
- Find Paul at the Churchill Club on Tuesday, June 17
- Podcast from Paul’s Panel at the Churchill Club
- Does Technology Make You Happier?
Hinkmond Wong's Weblog
If you're vegetarian, don't worry, you can follow along and just run the simulation of the Turkey Tweeter, or better yet, try a tofu version of the Turkey Tweeter.
Here is the parts list:
- 1 Vernier Go!Temp USB Temperature Probe
- 1 Uncooked Turkey
- 1 Raspberry Pi (not Pumpkin Pie)
- 1 Roll thermal reflective tape

You can buy the Vernier Go!Temp USB Temperature Probe for $39 from here: http://www.vernier.com/products/sensors/temperature-sensors/go-temp/. And, you can get the thermal reflective tape from any auto parts store. (Don't tell them what you need it for. Say it's for rebuilding the V-8 engine in your Dodge Hemi. Avoids the need for a long explanation and sounds cooler...)
The uncooked turkey can be found in your neighborhood grocery store. But, if you're making a vegetarian Tofurkey, you're on your own... The Java Embedded app will be the same, though (Java is vegan).
So, grab all your parts and come back here for the next part of this project...
This article has been updated on November 26 to include the option regarding downloading the MDS content.
The Meta Data Services (or MDS for short) of Oracle's SOA/BPM Suite is used to manage various types of artifacts like:
- Process models created with Process Composer,
- Abstract WSDL's and XSD's,
- Domain Value Map's (DVM), and even
- Artifacts of deployed composites.
To create an MDS connection, go to the Resource Palette -> New Connection -> SOA-MDS. This will pop up a dialog in which you create a database connection to the MDS, for example to the dev_mds schema. Having created the database connection, you have to choose the partition to use for the SOA-MDS connection. To be able to check out processes created with Composer from the MDS, or to save them in the MDS, you create a SOA-MDS connection that uses the obpm partition. As the name already suggests, this is a BPM-specific partition. To browse the other artifacts I mention above, you use the soa-infra partition, which is shared by both SOA and BPM.
In the figure below you can see two types of connections, above to the soa-infra and below to the obpm partition. In the (soa-infra) apps you can find the reusable artifacts that you have deployed explicitly (like abstract WSDL's, XSD's, EDL's).
What you also see is a deployed-composites folder that shows all composites that have been deployed. When expanding a composite, you will find that all artifacts are shown. This is a much easier way to verify that you do not deploy too many artifacts to the server than by introspecting the SAR file, I would say. Except for .bpmn files (which at the time of writing are not yet recognized by this MDS browser), you can open all plain text files in JDeveloper.
Downloading the MDS from Enterprise Manager
Now let's assume that you have not been given access to the MDS's DB schema on the environment (perhaps because it is Production), but you do have access to the Enterprise Manager. For this situation my dear colleague Subhashini Gumpula pointed me to the possibility of downloading the content from the MDS as follows:
soa-infra -> Administration -> MDS Configuration -> and then on the right side of the screen: Export.
This will download a soa-infra_metadata.zip file with its content!
Looking up Artifacts in the MDS Using a Browser
Now let's assume that you also have not been given access to Enterprise Manager on the environment, but you can access the server over HTTP. Thanks to my dear colleague Luc Gorrisen I recently learned that you can browse it using part of the URL of the composite, as follows:
For example, to look up the abstract WSDL of some ApplicationService that is used by some StudentRegistration business process, I can use the following URL.
Mind you, this is not restricted to only the WSDL's it is using.
Ain't that cool?!
It’s fair to say that Purdue University has sparked several important conversations in ed tech through their work on Course Signals. First, they pretty much put the retention early warning system as a product category on the map, conducting ground-breaking research and building a system that several major ed tech players have either licensed or imitated. More recently, they have sparked a conversation about the state of ed tech research and peer review as their more recent research has been called into question. I highly recommend reading the comment threads on these two posts to get a sense of that conversation.
Now I think Purdue may spark a third conversation—this time around the ethics of institutional learning analytics research and commercialization. Because there is no question in my mind that they have a serious ethical problem on their hands.
While I have no proof that Purdue is aware of the concerns that have been raised about the Course Signals research, I think it highly unlikely that they are unaware, after articles have been published in Inside Higher Ed and the Times Higher Education. The questions have been out for a month now, and so far we have nothing in the way of an official response from the university.
That’s a big problem for several reasons. First, as has been mentioned here before, Purdue has licensed its technology to Ellucian for sale to other schools. In other words, the university is effectively making money on the strength of research claims that have now been called into question. Second, the people who conducted and published the research are not tenured faculty but non-tenurable staff, and they did so using institutional data the access to which Purdue ostensibly controls. It seems overwhelmingly likely that the researchers whose work is being challenged are effectively powerless to respond without permission and support from their institution. If so, then these people are being put in a terrible position. They are listed as the authors of the research, but they do not have the power that an academic Principal Investigator would have to be properly accountable for the work.
For both of these reasons, I believe that Purdue has an ethical obligation as an institution to respond to the criticism. Since they seem disinclined (or at least slow) to do so of their own accord, perhaps some appropriate pressure can be brought to bear. If you are an Ellucian customer, I urge you to contact them and ask why there has not been an official response to the challenge regarding the research. Both of the partners here should know that their brand reputations and therefore future revenue streams are at stake here. (I would be grateful if you would let me know, either publicly or privately, if you take this step. I would like to keep track of the pressure that is being brought to bear. I will keep your name and that of your institution private if you want me to.)
But I also think there is a broader conversation that needs to happen about the general problem. On the one hand, schools have an obligation to protect the privacy of their students. This makes releasing student success research data challenging. On the other hand, if the research cannot be properly peer reviewed because it cannot be shared, then we cannot develop confidence in the research that is coming to us. This problem is exacerbated when research is conducted by staff whose independence is not protected, and by the increasing tendency of institutions to commercialize their educational technology research and development work. There needs to be a community-developed framework to help facilitate the safe and appropriate sharing of the data so institutions can be held accountable for their research and the staff who conduct that research can be appropriately protected.
How can you conditionally turn cell borders on and off in Publisher's RTF/XSL-FO templates? With a little digging you'll find what appear to be the appropriate attributes to update in your template. You would logically come up with using the various border styling options:
- border-top|bottom|left|right-width
- border-top|bottom|left|right-style
- border-top|bottom|left|right-color
Buuuut, that doesn't work. Updating them individually does not make a difference to the output. Not sure why, and I will ask, but for now here's the solution. Use the compound border attribute border-top|bottom|left|right. This takes the form border-bottom="0.5pt solid #000000", i.e. you set all three options at once rather than individually. In a BIP template you use:
<?if:DEPT='Accounting'?>
<?attribute@incontext:border-bottom;'3.0pt solid #000000'?>
<?attribute@incontext:border-top;'3.0pt solid #000000'?>
<?attribute@incontext:border-left;'3.0pt solid #000000'?>
<?attribute@incontext:border-right;'3.0pt solid #000000'?>
<?end if?>
3pt borders are a little excessive, but you get the idea. This approach can be used with the if@row option too, to get the complete row's borders to update. If your template will need to run in right-to-left languages, e.g. Arabic or Hebrew, then you will need to use start and end in place of left and right.
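For completeness, here is a sketch of the row-level variant combining the if@row option with the start/end border names for right-to-left templates. This is an untested sketch extrapolated from the cell-level example above, not output from a working template:

```xml
<?if@row:DEPT='Accounting'?>
<?attribute@incontext:border-bottom;'0.5pt solid #000000'?>
<?attribute@incontext:border-top;'0.5pt solid #000000'?>
<?attribute@incontext:border-start;'0.5pt solid #000000'?>
<?attribute@incontext:border-end;'0.5pt solid #000000'?>
<?end if?>
```

As with the cell-level version, place the condition in the row context so the attributes are applied to every cell in the matching row.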
For the inquisitive reader, you're maybe wondering: how did this guy know that? And why the heck is this not in the user docs?
Other than my all-knowing BIP guru status ;0), I hit the web for info on XSL-FO cell border attributes and then turned to the Template Builder for Word, particularly the export option: I generated the XSL-FO output from a test RTF template and took a look at the attributes. Then I started trying stuff out. I'm a hacker and proud of it! As for the user docs, I'll log a request for an update.
If you tuned into last week’s blog post, I talked about the new release of WebCenter Capture. Well, lo and behold, Oracle released a new version of OFR.
If you look closely, this is a step in the right direction, as the verifier component is now available as a web application module. That’s right: no more client installations of verifier, and access is via a web browser!
At Fishbowl we see this as a huge advantage to our customers: no more client installations of verifier, and only a few servers are kept busy while verifiers access the results via a browser. This helps our customers in two ways: 1) it improves the management of the overall system, and 2) it lowers their costs.
OFR is now architected in more of a distributed manner, which means a lot of the configuration will need to be centrally located in a database instead of on the file system. Configurations such as documents, batches (jobs), project references, and users and groups are stored there. But note that with this new version the file system is no longer supported, so you will be required to upgrade to a database for batches.
When looking at the new web verifier, I have to say I was fully expecting an ADF application, because that has been the trend with most of the Fusion Middleware technologies, which have adopted this framework. If you look at the screenshot below, it looks very similar to the client verifier; well, at least the layout is similar.
Overall, this is a great next step for imaging and the solutions that we can provide to our customers. With the web verifier, the clear benefit is no longer having to install verifier on each verifier station.
Like I stated in my post last week about ODC, this is a very exciting time for imaging. We are seeing a lot of demand for it. Companies are looking to move away from paper-based business processes to help save costs, and Fishbowl is willing to join you on that journey in implementing an automated imaging solution with the new versions of these products. If you have any questions about imaging solutions or the new version of OFR, please contact us at email@example.com or 952-465-3400.
Brad Bukacek Jr is a Senior Software Consultant/Team Lead who focuses on building Oracle imaging solutions. He also has an extensive background in architecting and developing solutions using Oracle SOA/BPM Suite. Brad is also heavily involved in the Oracle community through IOUG (Independent Oracle Users Group); he is the Middleware Track Manager for IOUG's yearly conference, Collaborate, and a Senior Contributing Editor for Select Journal. Follow his tweets at @bbukacek.
The post New Version of Oracle WebCenter Forms Recognition! appeared first on C4 Blog by Fishbowl Solutions.
I have been thinking about writing a Pythian blog for a long time, and today I finally took the opportunity.
In the DBA life, it’s common to get a request to move a database across servers due to an RDBMS upgrade plan or the arrival of new hardware. It’s not common, however, to receive a request to move the RDBMS Oracle home within the same server. This request may arrive due to improper planning: creating the Oracle home on the root mount point on Unix platforms, or the C:\ drive on Windows platforms (the system mount point/drive).
The cloning feature introduced by Oracle in the 10gR2 version comes in handy for this request. The use of the clone.pl script on the Unix platform is quite straightforward, as we have full control over Unix processes. The thread architecture on the Windows platform makes this a bit different, but not complex.
Let’s assume the current Oracle home is located in the "C:\oracle\product\11.1.0" directory, and the new directory we plan to move to is "D:\oracle\product\11.1.0". As usual, keep the database name as TEST. The steps below describe the activities required.
Step 1. Log into the Windows server as a local server user which is part of the local administrator and ora_dba groups. Leave the existing Oracle database running and start copying the entire contents of the "C:\oracle\product\11.1.0" directory to the "D:\oracle\product\11.1.0" directory. Ensure the copy process completes without any issues.
Step 2. Take the existing Oracle home inventory details for reference. Open a command prompt window (Window I) and execute these commands; their output is used for verification in step 5.
C:\>opatch version
C:\>opatch lsinventory
Step 3. Open a new command prompt (Window II) and set the environment variables appropriately.
C:\>set PERL5LIB=D:\oracle\product\11.1.0\perl\5.8.3\lib ==> This depends on the Perl version that exists under the Oracle home, and may differ from version to version.
Step 4. Run the clone.pl script from Window II.
C:\>perl clone.pl ORACLE_HOME="D:\oracle\product\11.1.0" ORACLE_HOME_NAME="OraDB11gR1_home" ORACLE_BASE="D:\oracle"
Execution of this command should complete without any issues. Review the log file C:\Program Files\Oracle\Inventory\logs\cloneActions<DATE>.log for verification.
Excerpts from the log file:
INFO: ca page to be shown: false
INFO: exitonly tools to be excuted passed: 0
*** End of Installation Page***
The cloning of OraDB11gR1_home was successful. ==> You should get this message.
Step 5. Execute the following commands from Window II to verify the newly cloned home configuration.
C:\>opatch version ==> Output should match the output obtained in step 2.
C:\>opatch lsinventory ==> Output should match the output obtained in step 2.
Step 6. Get a maximum of 15 minutes' downtime for the database and bring down the TEST database. Open the Services utility and stop the "OracleOraDB11g_homeTNSListener" service and the "OracleServiceTEST" service.
Step 7. On Window I, execute this command to delete the existing listener service from the server.
Step 8. Execute this command from Window I to delete the existing database services from the server.
C:\>ORADIM -DELETE -SID TEST
Step 9. Open the Services utility and confirm that all the services belonging to the old Oracle home, including "Oracle TEST VSS Writer Service" and "OracleJobSchedulerTEST", are deleted.
Step 10. Invoke Oracle Net Configuration Assistant from Window II to configure new listener service.
Step 11. Create the new database service from the new Oracle home, from Window II.
C:\>ORADIM -NEW -SID TEST -SYSPWD *** -STARTMODE auto -SPFILE
Note: This command would start the database instance too.
Step 12. Open the Services utility and confirm the following services got created from the new Oracle home. Modify the "Startup Type" for these services accordingly.
Oracle TEST VSS Writer Service
Now work with your application administrator, and confirm that everything works fine :)
If you use tnsping.exe and sqlplus.exe, the way sqlnet.ora and tnsnames.ora are located differs.
Let’s take the following setup:
C:\tmp>type dir1\sqlnet.ora
NAMES.DEFAULT_DOMAIN = (EXAMPLE.COM)
NAMES.DIRECTORY_PATH = (TNSNAMES)

C:\tmp>type dir1\tnsnames.ora
db.example.com=(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=srv1.example.com)(PORT=1521))(CONNECT_DATA=(SID=db01)))
db.example.org=(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=srv1.example.org)(PORT=1521))(CONNECT_DATA=(SID=db01)))

C:\tmp>type dir2\sqlnet.ora
NAMES.DEFAULT_DOMAIN = (EXAMPLE.ORG)
NAMES.DIRECTORY_PATH = (TNSNAMES)

C:\tmp>type dir2\tnsnames.ora
db.example.com=(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=srv2.example.com)(PORT=1521))(CONNECT_DATA=(SID=db02)))
db.example.org=(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=srv2.example.org)(PORT=1521))(CONNECT_DATA=(SID=db02)))
You set TNS_ADMIN to dir1 and your current directory is dir2.
Let’s try TNSPING.EXE first
C:\tmp>cd dir2

C:\tmp\dir2>set TNS_ADMIN=C:\tmp\dir1

C:\tmp\dir2>tnsping db

TNS Ping Utility for 64-bit Windows: Version 184.108.40.206.0 - Production on 25-NOV-2013 15:47:31
Copyright (c) 1997, 2013, Oracle. All rights reserved.

Used parameter files:
C:\tmp\dir1\sqlnet.ora

Used TNSNAMES adapter to resolve the alias
Attempting to contact (DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=srv2.example.com)(PORT=1521))(CONNECT_DATA=(SID=db02)))
OK (0 msec)
TNSPING.EXE uses the sqlnet.ora in the %TNS_ADMIN% directory (EXAMPLE.COM domain) and the tnsnames.ora in the current directory (db02).
Let’s try with sqlplus
C:\tmp>cd dir2

C:\tmp\dir2>set TNS_ADMIN=C:\tmp\dir1

C:\tmp\dir2>sqlplus -L system@db

SQL*Plus: Release 220.127.116.11.0 Production on Mon Nov 25 16:01:15 2013
Copyright (c) 1982, 2013, Oracle. All rights reserved.

Enter password:

Connected to:
Oracle Database 11g Enterprise Edition Release 18.104.22.168.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options

SQL> select * from global_name;

GLOBAL_NAME
-------------------------------------------------
DB02.EXAMPLE.ORG
SQLPLUS.EXE uses the sqlnet.ora in the current directory (EXAMPLE.ORG) and the tnsnames.ora in the current directory (db02).
This does not reproduce on Linux.
If you redirect output that is not completely plain ASCII (like an spfile listing) to a file in PowerShell, you may not see the same thing in the file as in the console output.
- without redirection
PS C:\> Select-String "compatible" .\spfileDB01.ora

spfileDB01.ora:13:*.compatible='11.2.0.4.0'
- with redirection
PS> Select-String "compatible" .\spfileDB01.ora > compatible.txt
PS> vim -b .\compatible.txt

ÿþ^M^@ ^@s^@p^@f^@i^@l^@e^@D^@B^@0^@0^@1^@.^@o^@r^@a^@:^@1^@3^@:^@*^@.^@c^@o^@m^@p^@a^@t^@i^@b^@l^@e^@=^@'^@1^@1^@.^@2^@.^@0^@.^@4^@.^@0^@'^@^M^@ ^@
- With redirection and conversion to ascii
PS> Select-String "compatible" .\spfileDB01.ora | Out-File -Encoding ASCII .\compatible.txt
PS> vim .\compatible.txt

spfileDB01.ora:13:*.compatible='11.2.0.4.0'
With Out-File (instead of >), you can specify the encoding, which does the trick.
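The "ÿþ" at the start of the binary dump is the UTF-16LE byte order mark, and the ^@ characters are the NUL high bytes of each 16-bit character: PowerShell's > operator writes Unicode (UTF-16LE) by default. A quick Python sketch (illustrative only; the sample string is from the spfile output above) shows the same effect:

```python
import codecs

text = "*.compatible='11.2.0.4.0'"

# PowerShell's ">" redirection writes UTF-16LE with a byte order mark (BOM).
utf16 = codecs.BOM_UTF16_LE + text.encode("utf-16-le")
print(utf16[:2])    # b'\xff\xfe' -- rendered as "ÿþ" when read as Latin-1
print(utf16[2:8])   # b'*\x00.\x00c\x00' -- the NULs show up as ^@ in vim

# Out-File -Encoding ASCII writes one byte per character, no BOM.
print(text.encode("ascii")[:8])
```

Reading the UTF-16 file back with a single-byte encoding is exactly what produces the interleaved-NUL garbage seen in vim.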
As cloud technologies continue to advance, entrepreneurs are becoming more capable of utilizing these services to launch new enterprises. Because the cloud provides business owners with increased flexibility, managing information according to the specific needs of the company has become more feasible. Additionally, the scalable storage features of cloud-based infrastructures make saving capital expenses easier for burgeoning enterprises.
Simplifying the tech process for new businesses
According to Enterprise Irregulars, the cloud's unique storage capabilities are particularly important for small companies. While new businesses are in the startup phase, it can be difficult to determine how much storage is necessary until after an adjustment period has occurred. Additionally, because decision-makers are typically responsible for handling IT, it is also crucial to deploy simple digital architectures that can be easily manipulated.
As cloud services provide all of these functions, the source noted that small business owners should consider leveraging one of these solutions first. A cloud-enhanced infrastructure can also reach a wider audience than legacy strategies. Now that the cloud has become so widely used across a range of industries, such as retail, academia and law, individuals are often already familiar with how the technology works when they reach their new positions. For this reason, the source reported that companies utilizing the cloud have a distinct advantage over organizations that do not, since the latter will require additional employee training.
According to Cloud Tweaks, unique applications deployed across a digital architecture can also provide young enterprises with better cost savings. Database administration services, for instance, can be attached to an organization's information to provide decision-makers with more security and tech support. For small companies that are just starting out, this can be a way to relieve owners from having to focus on IT.
As decision-makers begin the process of building their new enterprises, it's important that cost-effectiveness remains a vital component. By considering the above-mentioned strategies, companies can begin acting within their industries efficiently and with a plan to save on capital.
RDX offers a full suite of cloud migration and administrative services that can be tailored to meet any customer's needs. To learn more about our full suite of cloud migration and support services, please visit our Cloud DBA Service page or contact us.
Just a quick note to say the top-level approval for OTNYathra 2014 – India OTN Tour has been granted. Next, we’ve got to submit our individual travel approvals and see how that goes. If everything goes to plan, I will be representing the Oracle ACE Program on this tour in February, which visits the following locations.
- Jallandar – 18th February
- Delhi - 20th February
- Mumbai - 22nd February
- Pune - 23rd February
- Hyderabad - 25th February
- Bangalore - 27th February
- Chennai – 1st March
Fingers crossed everything goes to plan.
Tim…

OTNYathra 2014 : India OTN Tour was first posted on November 25, 2013 at 9:52 am.
©2012 "The ORACLE-BASE Blog". Use of this feed is for personal non-commercial use only. If you are not reading this article in your feed reader, then the site is guilty of copyright infringement.
Generalizing about SaaS (Software as a Service) is hard. To prune some of the confusion, let’s start by noting:
- SaaS has been around for over half a century, and at times has been the dominant mode of application delivery.
- The term multi-tenancy is being used in several different ways.
- Multi-tenancy, in the purest sense, is inessential to SaaS. It’s simply an implementation choice that has certain benefits for the SaaS provider. And by the way, …
- … salesforce.com, the chief proponent of the theory that true multi-tenancy is the hallmark of true SaaS, abandoned that position this week.
- Internet-based services are commonly, if you squint a little, SaaS. Examples include but are hardly limited to Google, Twitter, Dropbox, Intuit, Amazon Web Services, and the company that hosts this blog (KnownHost).
- Some of the core arguments for SaaS’ rise, namely the various efficiencies of data center outsourcing and scale, apply equally to the public cloud, to SaaS, and to AEaaS (Anything Else as a Service).
- These benefits are particularly strong for inherently networked use cases. For example, you really don’t want to be hosting your website yourself. And salesforce.com got its start supporting salespeople who worked out of remote offices.
- In theory and occasionally in practice, certain SaaS benefits, namely the outsourcing of software maintenance and updates, could be enjoyed on-premises as well. Whether I think that could be a bigger deal going forward will be explored in future posts.
For smaller enterprises, the core outsourcing argument is compelling. How small? Well:
- What’s the minimum level of IT operations headcount needed for mission-critical systems? Let’s just say “several”.
- What does that cost? Fully burdened, somewhere in the six figures.
- What fraction of the IT budget should such headcount be? As low a double digit percentage as possible.
- What fraction of revenues should be spent on IT? Some single-digit percentage.
So except for special cases, an enterprise with less than $100 million or so in revenue may have trouble affording on-site data processing, at least at a mission-critical level of robustness. It may well be better to use NetSuite or something like that, assuming needed features are available in SaaS form.*
*Truth be told, I’m not up to speed on mid-range SaaS application suite alternatives.
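The back-of-the-envelope arithmetic in those bullets can be made explicit. The specific numbers below are illustrative assumptions chosen to fit the ranges given, not figures from the post:

```python
# Rough minimum-revenue estimate for affording mission-critical,
# on-site IT operations (all inputs are illustrative assumptions).
ops_headcount = 3                          # "several" operations staff
cost_per_head = 150_000                    # fully burdened, six figures
ops_cost = ops_headcount * cost_per_head   # $450,000/year

it_budget = ops_cost * 10     # ops headcount held to ~10% of IT budget
min_revenue = it_budget * 20  # IT spend held to ~5% of revenue

print(f"${min_revenue:,}")    # $90,000,000 -- in the same ballpark as
                              # the "$100 million or so" threshold
```

Varying the assumptions within the stated ranges (2-5 staff, 10-20% of IT budget, 3-9% of revenue) keeps the result in the high eight to low nine figures, which is the point of the argument.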
Continuing that thought — if you’re a mid-range application software provider, you have to develop a SaaS version of your product line. That’s a very different business model than the apps + OEMed platform you’re probably providing now, but it’s the best way to serve your customers going forward. And by the way: while mid-range application software is commonly sold on a regional basis, SaaS can be sold more globally; after all, the need for onsite service is eliminated, and price points should in most cases fit with telephone sales. Yes, national language and regional data privacy rules are both concerns, but they still leave the available markets looking much bigger than regional resellers have traditionally enjoyed. So expect shake-outs in a whole lot of vertical markets, as vendors horn in on each other’s territories, and a few elephantine winners perhaps emerge.
The argument above assumes that extreme reliability is needed. So there’s nothing necessarily wrong with a small team of business analysts sticking an RDBMS appliance* in a corner and managing it themselves. If it sputters from time to time, who cares; using it still may be easier than getting that data in and out of the cloud. But eventually, if all the data is remote anyway — SaaS, website, etc. — then it may make sense to do analytics remotely as well.
*Previously, that appliance might have been from Netezza; now, my first thought is the cheaper — albeit more limited — Infobright.
The arguments that direct smaller companies toward SaaS apply to large enterprises too, but they aren’t as dispositive. Larger enterprises can actually afford to do their own IT operations if they want to. What’s more, moving away from in-house operations is harder for big firms, due to the larger and more customized portfolio of legacy systems they’re likely to have. That said:
- Almost all enterprises should have their internet-facing systems offsite, even if just via co-location. The core reasons are that ingesting high-volume inbound network traffic is inherently difficult, and security issues make it much tougher yet. In addressing these challenges, specialists enjoy significant economies of scale.
- Most enterprises will have plenty of SaaS silos. If nothing else:
- Complex machinery will increasingly “phone home” for help staying in good working order. That’s a form of SaaS.
- Information providers and aggregators tend to deliver via SaaS.
- Various kinds of collaboration and communication apps, from Google Mail to Dropbox, live in the cloud. Personal productivity applications, from word processing to Photoshop, may be following.
- “Rodney Dangerfield” departments — i.e., ones unhappy with the respect and attention they get from central IT — often turn to SaaS or similar outsourcing. Human resources is an obvious example, from Automatic Data Processing to Employease to, these days, Workday.
That leaves us with the questions as to when and how large enterprises should or will move their core applications to SaaS and/or the cloud. Given the length of this post, I won’t try to answer them now. But for starters:
- Enterprises don’t like to rip and replace their apps, except in consolidation projects, as long as they can avoid doing so.
- Cloud/remote computing economies are less convincing if you already have your computer rooms staffed and set up.
- A key benefit of SaaS is that vendors control and drive the upgrade cycles. One cost of that is restrictions on customization, although you can also build apps and app extensions on PaaS/DBaaS/WaaS (Platform/DataBase/Whatever as a Service) offerings such as force.com.
- Lock-in is a serious concern, for application and platform offerings alike. Not only are you betting on one vendor’s software black box, you’re also betting on its remote computing operation. If you grow dissatisfied with either, or with their pricing, you may not have much opportunity to escape.