Feed aggregator

Things I still believe in

Rob Baillie - Fri, 2018-10-19 09:49
Over 10 years ago I wrote a blog post on things that I believe in - as a developer, and when I re-read it recently I was amazed at how little has changed.

I'm not sure if that's a good thing, or a bad thing - but it's certainly a thing.

Anyway - here's that list - slightly updated for 2018... if you've seen my talk on Unit Testing recently, you might recognise a few entries.

(opinions are my own, yada yada yada)
  • It's easier to re-build a system from its tests than to re-build the tests from their system.

  • You can measure code complexity, adherence to standards and test coverage; you can't measure quality of design.

  • Formal and flexible are not mutually exclusive.

  • The tests should pass, first time, every time (unless you're changing them or the code).

  • Test code is production code and it deserves the same level of care.

  • Prototypes should always be thrown away.

  • Documentation is good, self documenting code is better, code that doesn't need documentation is best.

  • If you're getting bogged down in the process then the process is wrong.

  • Agility without structure is just hacking.

  • Pair programming allows good practices to spread.

  • Pair programming allows bad practices to spread.

  • Team leaders should be inside the team, not outside it.

  • Project Managers are there to facilitate the practice of developing software, not to control it.

  • Your customers are not idiots; they always know their business far better than you ever will.

  • A long list of referrals for a piece of software does not increase the chances of it being right for you, and shouldn't be considered when evaluating it.

  • You can't solve a problem until you know what the problem is. You can't answer a question until the question's been asked.

  • Software development is not complex by accident, it's complex by essence.

  • Always is never right, and never is always wrong.

  • Interesting is not the same as useful.

  • Clever is not the same as right.

  • The simplest thing that will work is not always the same as the easiest thing that will work.

  • It's easier to make readable code correct than it is to make clever code readable.

  • If you can't read your tests, then you can't read your documentation.

  • There's no better specification document than the customer's voice.

  • You can't make your brain bigger, so make your code simpler.

  • Sometimes multiple exit points are OK. The same is not true of multiple entry points.

  • Collective responsibility means that everyone involved is individually responsible for everything.

  • Sometimes it's complex because it needs to be; but you should never be afraid to double check.

  • If every time you step forward you get shot down you're fighting for the wrong army.

  • If you're always learning you're never bored.

  • There are no such things as "Best Practices". Every practice can be improved upon.

  • Nothing is exempt from testing. Not even database upgrades or declarative tools.

  • It's not enough to collect data, you need to analyse, understand and act upon that data once you have it.

  • A long code freeze means a broken process.

  • A test hasn't passed until it has failed.

  • A test that can't fail isn't a test.

  • If you give someone a job, you can't guarantee they'll do it well; if you give someone two jobs, you can guarantee they'll do both badly.

  • Every meeting should start with a statement on its purpose and context, even if everyone in the meeting already knows.

add_colored_sql

Jonathan Lewis - Fri, 2018-10-19 09:08

The following request appeared recently on the Oracle-L mailing list:

I have one scenario related to capturing of sql statement in history table..  Like dba_hist_sqltext capture the queries that ran for 10 sec or more..  How do I get the sql stmt which took less time say in  millisecond..  Any idea pleae share.

An AWR snapshot captures statements that (a) meet some workload criteria such as “lots of executions” and (b) happen to be in the library cache when the snapshot takes place. But if you have some statements which you think are important or interesting enough to keep an eye on, yet which don’t do enough work to meet the normal workload requirements of the AWR snapshots, it’s still possible to tell Oracle to capture them by “coloring” them.  (Apologies for the American spelling – it’s necessary to avoid error ‘PLS-00302: component %s must be declared’.)

Somewhere in the 11gR1 timeline the package dbms_workload_repository acquired the following two procedures:


PROCEDURE ADD_COLORED_SQL
 Argument Name                  Type                    In/Out Default?
 ------------------------------ ----------------------- ------ --------
 SQL_ID                         VARCHAR2                IN
 DBID                           NUMBER                  IN     DEFAULT

PROCEDURE REMOVE_COLORED_SQL
 Argument Name                  Type                    In/Out Default?
 ------------------------------ ----------------------- ------ --------
 SQL_ID                         VARCHAR2                IN
 DBID                           NUMBER                  IN     DEFAULT


You have to be licensed to use the workload repository, of course, but if you are, you can call the first procedure to mark an SQL statement as interesting, after which its execution statistics will be captured whenever it’s still in the library cache at snapshot time. The second procedure lets you stop the capture – and you will probably want to use it from time to time, because there’s a limit (currently 100) to the number of statements you’re allowed to register as colored, and if you try to exceed that limit your call will raise Oracle error ORA-13534:


ORA-13534: Current SQL count(100) reached maximum allowed (100)
ORA-06512: at "SYS.DBMS_WORKLOAD_REPOSITORY", line 751
ORA-06512: at line 3
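
For reference, registering (and later de-registering) a statement is a simple PL/SQL call – a minimal sketch using the sql_id that appears in the query further down:

begin
        dbms_workload_repository.add_colored_sql(sql_id => 'aedf339438ww3');
end;
/

-- and, when you no longer need the statement captured:
begin
        dbms_workload_repository.remove_colored_sql(sql_id => 'aedf339438ww3');
end;
/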

If you want to see the list of statements currently marked as colored you can query the table wrm$_colored_sql, exposed through the views dba_hist_colored_sql and (in 12c) cdb_hist_colored_sql. (Note: I haven’t tested whether the limit of 100 statements is per PDB or summed across the entire CDB – and the answer may vary with the version of Oracle, of course.)


SQL> select * from sys.wrm$_colored_sql;

      DBID SQL_ID             OWNER CREATE_TI
---------- ------------- ---------- ---------
3089296639 aedf339438ww3          1 28-SEP-18

1 row selected.

If you’ve had to color a statement to force the AWR snapshot capture it the statement probably won’t appear in the standard AWR reports; but it will be available to the “AWR SQL” report (which I usually generate from SQL*Plus with a call to $ORACLE_HOME/rdbms/admin/awrsqrpt./sql).

Footnote

If the statement you’re interested in executes very infrequently and often drops out of the library cache before it can be captured in an AWR snapshot then an alternative strategy is to enable system-wide tracing for that statement so that you can capture every execution in a trace file.
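
One way to set up such a trace (a sketch of the usual approach, not part of the original note – the sql_id is the one used above and purely illustrative) is the sql_trace event scoped to a single sql_id:

-- trace every execution of one statement, system-wide
alter system set events 'sql_trace[sql: aedf339438ww3]';

-- and switch it off again when you have what you need
alter system set events 'sql_trace[sql: aedf339438ww3] off';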

 

 

ADF Task Flow Performance Boost with JET UI Shell Wrapper

Andrejus Baranovski - Fri, 2018-10-19 01:09
An ADF application with UI Shell and ADF Task Flows rendered in dynamic tabs does not offer an instant switch from one tab to another. That's because the tab switch request goes to the server, and the tab switch only happens when the browser gets the response. There is more to it: even if a tab in ADF is not currently active, its content (e.g. a region rendered from an ADF Task Flow) may still participate in request processing. If the user opens many tabs, this can result in slightly slower request processing time overall.

ADF allows you to render an ADF Task Flow directly by accessing it through a URL, if the task flow is configured with page support at the root level. Since an ADF Task Flow can be accessed by URL, we can include it in an iframe. Imagine using an iframe for each tab and rendering the ADF Task Flow inside. This enables independent processing of the ADF Task Flow in each tab, similar to opening them in separate browser tabs.

An iframe can be managed in Oracle JET using plain JavaScript and HTML code. My sample implements dynamic JET tabs with iframe support, where the iframe renders an ADF Task Flow. While navigating between tabs, I simply hide/show iframes; this keeps the state of the ADF Task Flow and returns to the same state when the tab is opened again. The huge advantage in this case is that tab navigation and switching between tabs with ADF Task Flows works very fast - it takes only client-side processing time. Look at this recorded gif, where I navigate between tabs with ADF content:


The main functions are listed below; a combined JavaScript sketch follows after step 4.

1. Add a dynamic iframe. Here we check if the frame for the given ADF Task Flow has already been created; if not, we create it and append it to the HTML element.


2. Select the iframe when switching tabs. Hide all frames first, then show the frame which belongs to the selected tab.


3. Remove the iframe. The frame is removed when its tab is closed.


4. Select a frame after removal. This method helps to set focus to the next frame after the current tab has been removed.
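
A rough outline of these four functions could look like the following. This is a minimal sketch only – the element IDs, function names and the frame bookkeeping are assumptions for illustration, not the exact code from the sample app:

// Sketch only: 'tabContainer' and the task flow URL passed in are assumptions.
var frames = {};

// 1. Add a dynamic iframe for a tab; create it only if it does not exist yet
function addFrame(tabId, taskFlowUrl) {
    if (!frames[tabId]) {
        var frame = document.createElement('iframe');
        frame.id = 'frame_' + tabId;
        frame.src = taskFlowUrl;               // ADF Task Flow exposed by URL
        frame.style.width = '100%';
        frame.style.border = 'none';
        document.getElementById('tabContainer').appendChild(frame);
        frames[tabId] = frame;
    }
    selectFrame(tabId);
}

// 2. Select the iframe when switching tabs: hide all frames, show the selected one
function selectFrame(tabId) {
    Object.keys(frames).forEach(function (key) {
        frames[key].style.display = (key === tabId) ? 'block' : 'none';
    });
}

// 3. Remove the iframe when its tab is closed
function removeFrame(tabId) {
    if (frames[tabId]) {
        frames[tabId].parentNode.removeChild(frames[tabId]);
        delete frames[tabId];
    }
}

// 4. After a tab is removed, move focus to one of the remaining frames
function selectFrameAfterRemove() {
    var remaining = Object.keys(frames);
    if (remaining.length > 0) {
        selectFrame(remaining[0]);
    }
}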


We can control whether an iframe or a regular JET module is rendered by using a computed flag function assigned to the main div:


In this app I have defined static URLs for the displayed ADF Task Flows. The same could be loaded by fetching a menu, etc.:


To be able to load an ADF Task Flow by URL, make sure to use an ADF Task Flow with a page (you can include an ADF region with fragments into that page). Set the url-invoke-allowed property:


This is how it looks. By default, the JET dashboard module is displayed; select an item from the menu list to load a tab with an ADF Task Flow:


JET tab rendering iframe with ADF table:


You can monitor ADF content loading in iframe within JET application:


JET tab rendering iframe with ADF form:


Download sample app from GitHub repository.

How to extract XML data using extract function

Tom Kyte - Thu, 2018-10-18 16:06
inserted row by using below statemet :- <code>insert into xmlt values('<?xml version="1.0"?> <ROWSET> <ROW> <NAME>karthick</NAME> <SALARY>3400</SALARY> </ROW> <ROW> <NAME>c</NAME> <SALARY>1</SALARY> </ROW> <ROW> <NAME>mani</NAME> <SALARY>1</SAL...
Categories: DBA Blogs

Elaborate why 5 & same table used in below query

Tom Kyte - Thu, 2018-10-18 16:06
<code>select distinct * from t t1 where 5 >= ( select count ( distinct t2.sal ) from t t2 where t2.deptno = t1.deptno and t2.sal >= t1.sal );</code> I'll be grateful if you can explain. how the number work ,5, without var...
Categories: DBA Blogs

Firefox Quantum ESR 60 Certified with EBS 12.1 and 12.2 for macOS High Sierra 10.13

Steven Chan - Thu, 2018-10-18 13:04

Mozilla Firefox Quantum Extended Support Release 60 is certified as a macOS-based client browser on macOS High Sierra (macOS 10.13) for Oracle E-Business Suite 12.1 and 12.2.

What Use-cases Are Certified?

Oracle E-Business Suite (EBS) R12 has two interfaces: a web-based (OA Framework/HTML) model for modules like iProcurement and iStore, and Oracle Forms/Java based model for our professional services modules like Oracle Financials.

Firefox Quantum Extended Support Release (ESR) 60.x is now certified with macOS for both web-based and Oracle Forms/Java based models as outlined below.

  • Firefox ESR 60.x is certified for EBS users running web-based (HTML / OA Framework) screens.
  • Firefox ESR 60.x is certified for running Java content in EBS using Java Web Start (JWS) technology.
  • Firefox ESR 60.x is not certified for running Java content in EBS using Java Plug-in technology.

Certified Versions

Oracle E-Business Suite

  • Oracle E-Business Suite 12.2
  • Oracle E-Business Suite 12.1

Desktop Operating System

  • macOS High Sierra (macOS 10.13.3 or higher)

Java Web Start (JWS)

  • JRE 8 Update 171 or higher

While this is the minimum recommended Java release, users are encouraged to upgrade to the latest and therefore most secure Java 8 CPU release available.

Prerequisite Patch Requirements

Running Firefox on macOS using Java Web Start (JWS) with EBS requires additional patching.

For further information on patch and set up requirements see

Implications for Safari 12 on macOS

Customers have been asking about the compatibility of new versions of the following Apple products with Oracle E-Business Suite Releases 12.1 and 12.2:

  • Safari 12 (works with macOS 10.13.6 and 10.12.6)
  • macOS Mojave (macOS 10.14)

Neither of these two products has been certified with either EBS 12.1 or EBS 12.2 as of October 18, 2018.

Safari 12 is unable to launch Java in the way that prior Safari versions could. This will prevent E-Business Suite 12.1 and 12.2 customers from running Forms-based products. Therefore, customers should *NOT* upgrade to Safari 12 on macOS desktop platforms.

macOS Mojave (macOS 10.14) includes Safari 12. Customers should *NOT* upgrade to macOS Mojave.

What changed in Safari 12?

Safari 12 introduces an important change: it removes support for “legacy NPAPI plug-ins”. This affects all EBS releases. macOS Mojave includes Safari 12.

Some products within Oracle EBS 12.1 and 12.2 run via HTML in browsers. These products are sometimes called “self-service web applications”. They are expected to run without issue in Safari 12, but our certification testing is still underway.

Some products within Oracle EBS 12.1 and EBS 12.2 use Oracle Forms. Oracle Forms requires Java for desktop clients. On the macOS desktop platform, the only certified option today for launching Java is the JRE plug-in, which uses the NPAPI approach.

This means that Safari 12 and macOS Mojave (macOS 10.14) will be unable to use the current JRE plugin-based launching technology for Java and Forms for EBS desktop users.

Recommendations for EBS customers on macOS platforms

As of today, the latest certified versions of Safari and macOS are:

  • Safari 11 (works with macOS 10.13)
  • macOS High Sierra (macOS 10.13)

EBS customers should use only certified configurations. EBS customers who use Forms-based products should avoid upgrading to Safari 12 or macOS Mojave today.

EBS customers who have upgraded to Safari 12 on macOS 10.13 can use Firefox ESR 60 to run Forms-based products via the Java Web Start technology.

What is Mozilla Firefox ESR?

Mozilla offers an Extended Support Release based on an official release of Firefox for organizations that are unable to mass-deploy new consumer-oriented versions of Firefox every six weeks.  For more details about Firefox ESR, see the Mozilla ESR FAQ.

E-Business Suite certified with Firefox Extended Support Releases Only

New personal versions of Firefox on the Rapid Release channel are released roughly every six weeks.  It is impractical for us to certify these new personal Rapid Release versions of Firefox with the Oracle E-Business Suite because a given Firefox release is generally obsolete by the time we complete the certification.

From Firefox 10 and onwards, Oracle E-Business Suite is certified only with selected Firefox Extended Support Release versions. Oracle has no current plans to certify new Firefox personal releases on the Rapid Release channel with the E-Business Suite.

Plug-in Support removed in Firefox ESR 60

Mozilla has removed plug-in support in Firefox ESR 60. This means Firefox ESR 60 cannot run Forms-based content in EBS using the Java plugin method. 

If your Firefox ESR 60 end-users run Forms-based content in EBS, you must switch from the JRE plug-in to Java Web Start.

EBS patching policy for Firefox compatibility issues

Mozilla stresses their goal of ensuring that Firefox personal versions will continue to offer the same level of application compatibility as Firefox Extended Support Releases. 

Oracle E-Business Suite Development will issue new E-Business Suite patches or workarounds for issues that can be reproduced with Firefox Extended Support Releases.  If you report compatibility issues with Firefox personal releases that cannot be reproduced with Firefox Extended Support Releases, your options are:

  1. Deploy a certified Firefox Extended Support Release version instead of the Firefox personal version
  2. Report the incompatibility between Firefox ESR and Firefox personal to Mozilla
  3. Use Internet Explorer (on Windows) or Safari (on Mac OS X) until Mozilla resolves the issue

EBS Compatibility with Firefox ESR security updates

Mozilla may release new updates to Firefox ESR versions to address high-risk/high-impact security issues.  These updates are considered to be certified with the E-Business Suite on the day that they're released.  You do not need to wait for a certification from Oracle before deploying these new Firefox ESR security updates.

References

Related Articles

Categories: APPS Blogs

Monitoring Linux With Nmon

Yann Neuhaus - Thu, 2018-10-18 09:37

I was looking for tools to monitor Linux servers and found an interesting one: nmon (short for Nigel’s Monitor). I did some tests. In this blog I describe how to install nmon and how we can use it.
I am using an Oracle Enterprise Linux system.

[root@condrong nmon]# cat /etc/issue
Oracle Linux Server release 6.8
Kernel \r on an \m

[root@condrong nmon]#

For the installation I used the repository epel

wget http://dl.fedoraproject.org/pub/epel/6/x86_64/epel-release-6-8.noarch.rpm
rpm -ivh epel-release-6-8.noarch.rpm 
yum search nmon
yum install nmon.x86_64

Once installed, the tool is launched by just running the nmon command

[root@condrong nmon]# nmon

nmon1

If we type c we have CPU statistics
nmon2
If we type m we have memory statistics
nmon3
If we type t we can see Top Processes and so on
nmon4

nmon can also be scheduled. The data is collected in a file which can be analyzed later. For this we can use the following options:

OPTIONS
       nmon follows the usual GNU command-line syntax, with long options starting
       with two dashes (‘-’).  nmon [-h] [-s <seconds>] [-c <count>] [-f [-d <disks>]
       [-t] [-r <runname>]] [-x]  A summary of options is included below.

       -h            FULL help information

                     Interactive-Mode: read the startup banner and type "h" once it
                     is running. For Data-Collect-Mode see (-f).

       -f            spreadsheet output format [note: default -s300 -c288]

       -s <seconds>  between refreshing the screen [default 2]

       -c <number>   of refreshes [default millions]

       -d <disks>    to increase the number of disks [default 256]

       -t            spreadsheet includes top processes

       -x            capacity planning (15 min for 1 day = -fdt -s 900 -c 96)

In my example I just create a file my_nmon.sh and execute the script

[root@condrong nmon]# cat my_nmon.sh 
#! /bin/bash
nmon -f -s 60 -c 30

[root@condrong nmon]# chmod +x my_nmon.sh 
[root@condrong nmon]# ./my_nmon.sh

Once executed, the script creates a file in the current directory with the extension .nmon

[root@condrong nmon]# ls -l *.nmon
-rw-r--r--. 1 root root 55444 Oct 18 09:51 condrong_181018_0926.nmon
[root@condrong nmon]#
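
If you want the collection to run regularly, you could drive the same command from cron. This is only a sketch – the output directory and the 5-minute/24-hour sampling are assumptions, not part of the original setup:

# /etc/crontab entry: capture a snapshot every 5 minutes for 24 hours, starting at midnight
0 0 * * * root cd /var/log/nmon && /usr/bin/nmon -f -s 300 -c 288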

To analyze this file we have many options. I downloaded the nmon_analyzer.
This tool works with Excel 2003 onwards and supports 32-bit and 64-bit Windows.
After copying my nmon output file to my Windows workstation, I just have to open the Excel file and then use the Analyze nmon data button.
nmon5
And below I show some graphs made by the nmon_analyzer
nmon6

nmon7

nmon8

Conclusion
As we can see, nmon is a very useful tool which can help with monitoring our servers. It also works on AIX systems.

The article Monitoring Linux With Nmon first appeared on Blog dbi services.

[BLOG] Oracle WebLogic Administration: Machine and Node Manager

Online Apps DBA - Thu, 2018-10-18 07:21

Do you want to enhance your knowledge of Weblogic Administration and want to know what are machine and node managers in Weblogic Server? Visit: http://bit.ly/2EvGFEs to go through the blog which covers: ✔What is a Node Manager and what are its requirements ✔How to Start & Stop the Node Manager? ✔What is Machine in Weblogic […]

The post [BLOG] Oracle WebLogic Administration: Machine and Node Manager appeared first on Oracle Trainings for Apps & Fusion DBA.

Categories: APPS Blogs

Oracle Delivers the Trifecta of Retail Insight with New Cloud Service

Oracle Press Releases - Thu, 2018-10-18 07:00
Press Release
Oracle Delivers the Trifecta of Retail Insight with New Cloud Service Oracle Retail Insights Cloud Service Suite Delivers Descriptive, Prescriptive and Predictive Analytics to the Retail Enterprise

Redwood Shores, Calif.—Oct 18, 2018

Oracle Retail has combined three cloud services into a new Oracle Retail Insights Cloud Service Suite. By combining existing science and insight cloud services, Oracle can provide a spectrum of analytics that align to key performance indicators for the retail community. These metrics render in a beautiful user experience with dashboards organized by persona and organizational responsibilities in Oracle Retail Home to encourage more strategic decisions that drive growth and operational efficiency. Oracle Retail customers including Gap Inc., Lojas Renner and Al Nahdi have already experienced the benefits of Oracle Retail Insights and Science solutions and continue to inform their strategic decisions with in-depth insights and science-enabled analytics.

"The Advanced and Predictive Analytics software market, which in 2017 reached $3.1 billion worldwide, is expected to grow at a five-year CAGR of 9.4%. Sophisticated analytical techniques are being embedded into more and more applications," said Chandana Gopal, Research Manager, Analytics and Information Management, IDC. "Forward-looking analytics is going to become much more mainstream, as enterprises are able to harness more and more data from a variety of sources."

“We are working with several retailers who are anxious to adopt cloud to bridge the gap between operations and innovations,” said Jeff Warren, Vice President, Oracle Retail. “To capitalize on the surge of unstructured and structured data in retail, we have applied advanced techniques for analyzing retail data from multiple perspectives into a single cloud services suite that integrates with retail-rich applications and cloud services. With these tools we can deliver analysis on what happened (descriptive), what is going to happen (predictive) and what a retailer should do about it going forward (prescriptive).”

The Trifecta: A Powerful Adaptive Intelligence Suite for the Entire Retail Enterprise

With the new Oracle Retail Insights Cloud Service Suite retail organizations can experience benefits including:

  • Enhanced User Experience and Relevance: The cloud suite leverages Oracle Retail Home to provide a single and modern access point to the data. The user experience streamlines and simplifies access to data and applications to provide relevant and actionable information based on roles and responsibilities. The federated user interfaces support integrated insights-to-action loops.

  • Speed to Value: With one rapidly-deployed cloud service, the solution represents the application of Oracle's analytical core to modern retailing: a comprehensive big data warehouse founded on industry best practices and the scalability, reliability, and economy of a complete Oracle analytic tech stack in the Oracle Cloud.

  • Better Understanding of Customer Context: Gain a better understanding of who your customers are, how they behave and why, so you can make the more intelligent product and promotion decisions. Leverage complete visibility into what motivates customers at each stage of their journey and how they are interacting with your brand across all touchpoints.

  • Uncover Merchandising Intelligence: Identify actionable merchandising opportunities across touchpoints, including backorder and returns, top/bottom seller, demand/fulfillment and price and promotion analysis.

  • Inspire Customer Loyalty: Leverage a highly visual, intuitive, end-to-end workflow to define and execute local market assortments, improve conversion of traffic into sales, and increase customer satisfaction.

  • Leverage Artificial Intelligence and Machine Learning: Retail business users can conduct advanced analyses to understand better and optimize affinity, store clustering, customer segmentation, consumer decision trees, demand transference, and attribute extraction.

  • Unleash the Power of Flexibility and Ad Hoc Reporting: Business analysts and data science teams can leverage innovation workbench for additional ad hoc analysis.

  • Leverage Common Foundational Data Architecture: The suite can exploit the logical value of the data generated by Oracle Retail's comprehensive application footprint, and surfaces properly-filtered and secured descriptive, predictive and prescriptive analytics to whomever, however, whenever and wherever desired.

  • Drive Retail Investment: Optimize assortments to available space to maximize planogram performance, return-on-space, sales, revenue, and profits, while improving customer satisfaction with the optimal variety for each store.

  • Improve Gross Margin: Drive optimal recommendations for promotions, markdowns, and targeted offers that maximize profits and sell through leveraging prescriptive analytics.

Contact Info
Matt Torres
Oracle
14155951584
matt.torres@oracle.com
About Oracle

The Oracle Cloud offers complete SaaS application suites for ERP, HCM and CX, plus best-in-class database Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) from data centers throughout the Americas, Europe and Asia. For more information about Oracle (NYSE:ORCL), please visit us at www.oracle.com.

About Oracle Retail

Oracle provides retailers with a complete, open, and integrated suite of best-of-breed business applications, cloud services, and hardware that are engineered to work together and empower commerce. Leading fashion, grocery, and specialty retailers use Oracle solutions to anticipate market changes, simplify operations and inspire authentic brand interactions. For more information, visit our website at www.oracle.com/retail.

Trademarks

Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.

Talk to a Press Contact

Matt Torres

  • 14155951584

Find if a string is Upper, Lower or Mixed Case, numeric, Alpha Numeric etc

Tom Kyte - Wed, 2018-10-17 21:46
Dear Experts, I populated a table with few rows of strings that are Upper/ Lower/ Mixed case, alpha-numeric, numeric etc. 1. Now I would like to evaluate they type of string using a case statement. I tried using regexp_like, but it fails when ...
Categories: DBA Blogs

Deadlock issue came while using set based sql

Tom Kyte - Wed, 2018-10-17 21:46
Hi Tom, We are using set based sql in my process, In that we are creating so many GTT tables in a package. And we are executing this package concurrently in more than ten sessions, these sessions will create temporary tables with different name a...
Categories: DBA Blogs

Foreign Keys with default values

Tom Kyte - Wed, 2018-10-17 21:46
Hello. I'm designing a database in Oracle 12.2 in Toad Data Modeler. It would get lots of inserts. I'm using identity columns as PK (basically I create a sequence and use it as default value in the column, sequence.nextval). When I connect the...
Categories: DBA Blogs

Migration of 6i Forms to APEX

Tom Kyte - Wed, 2018-10-17 21:46
Hi Team, I am trying to migrate forms 6i to APEX, but problem that i pose here is that i cannot completely migrate all the functionalities of my forms to Apex even after trying to correct Metadata it does not migrate forms completely. So, my q...
Categories: DBA Blogs

Critical Patch Update for October 2018 Now Available

Steven Chan - Wed, 2018-10-17 11:57

The Critical Patch Update (CPU) for October 2018 was released on 16 October 2018. Oracle strongly recommends applying the patches as soon as possible.

The Critical Patch Update Advisory is the starting point for relevant information. It includes a list of products affected, pointers to obtain the patches, a summary of the security vulnerabilities, and links to other important documents. 

Supported products not listed in the "Supported Products and Components Affected" Section of the advisory do not require new patches to be applied.

The Critical Patch Update Advisory is available at the following location:

It is essential to review the Critical Patch Update supporting documentation referenced in the Advisory before applying patches.

The next four Critical Patch Update release dates are:

  • 15 January 2019
  • 16 April 2019
  • 16 July 2019
  • 15 October 2019
References

Related Articles
Categories: APPS Blogs

Problem Solving

Jonathan Lewis - Wed, 2018-10-17 10:11

Here’s a little question that popped up on the Oracle-L list server a few days ago:

I am facing this issue running this command in 11.2.0.4.0 (also in 12c R2 I got the same error)

SQL> SELECT TO_TIMESTAMP('1970-01-01 00:00:00.0','YYYY-MM-DD HH24:MI:SS.FF') + NUMTODSINTERVAL(2850166802000/1000, 'SECOND') FROM DUAL;
SELECT TO_TIMESTAMP('1970-01-01 00:00:00.0','YYYY-MM-DD HH24:MI:SS.FF') + NUMTODSINTERVAL(2850166802000/1000, 'SECOND') FROM DUAL
ORA-01873: a precisão precedente do intervalo é pequena demais

 

How do you go about finding out what’s going on? In my case the first thing is to check the translation of the error message (two options):

SQL> execute dbms_output.put_line(sqlerrm(-1873))
ORA-01873: the leading precision of the interval is too small

SQL> SELECT TO_TIMESTAMP('1970-01-01 00:00:00.0','YYYY-MM-DD HH24:MI:SS.FF') + NUMTODSINTERVAL(2850166802000/1000, 'SECOND') FROM DUAL;
SELECT TO_TIMESTAMP('1970-01-01 00:00:00.0','YYYY-MM-DD HH24:MI:SS.FF') + NUMTODSINTERVAL(2850166802000/1000, 'SECOND') FROM DUAL
                                                                                                       *
ERROR at line 1:
ORA-01873: the leading precision of the interval is too small

That didn’t quite match my guess, but it was similar, I had been guessing that it was saying something about precision – but it doesn’t really strike me as an intuitively self-explanatory message, so maybe a quick check in $ORACLE_HOME/rdbms/mesg/oraus.msg to find the error number with cause and action will help:


01873, 00000, "the leading precision of the interval is too small"
// *Cause: The leading precision of the interval is too small to store the
//  specified interval.
// *Action: Increase the leading precision of the interval or specify an
//  interval with a smaller leading precision.

Well, that doesn’t really add value – and I can’t help feeling that if the leading precision of the interval is too small it won’t help to make it smaller. So all I’m left to go on is that there’s a precision problem of some sort and it’s something to do with the interval, and probably NOT with adding the interval to the timestamp. So let’s check that bit alone:


SQL> SELECT NUMTODSINTERVAL(2850166802000/1000, 'SECOND') FROM DUAL;
SELECT NUMTODSINTERVAL(2850166802000/1000, 'SECOND') FROM DUAL
                                    *
ERROR at line 1:
ORA-01873: the leading precision of the interval is too small


So the interval bit is the problem. Since the problem is about “precision”, let’s try messing about with the big number. First I’ll do a bit of cosmetic tidying by doing the division to knock off the trailing zeros, then I’ll see what happens when I divide by 10:

SQL> SELECT NUMTODSINTERVAL(285016680, 'SECOND') from dual;

NUMTODSINTERVAL(285016680,'SECOND')
---------------------------------------------------------------------------
+000003298 19:18:00.000000000

So 285 million works, but 2.85 billion doesn’t. The value that works gives an interval of about 3,298 days, which is about 10 years, so maybe there’s an undocumented limit of 100 years on the input value; on the other hand the jump from 285 million to 2.85 billion does take you through a critical computer-oriented limit: 2^31 – 1, the maximum signed 32-bit integer (2,147,483,647). So let’s try using that value, and that value plus 1, in the expression:


SQL> SELECT NUMTODSINTERVAL(power(2,31), 'SECOND') from dual;
SELECT NUMTODSINTERVAL(power(2,31), 'SECOND') from dual
                       *
ERROR at line 1:
ORA-01873: the leading precision of the interval is too small


SQL> SELECT NUMTODSINTERVAL(power(2,31)-1, 'SECOND') from dual;

NUMTODSINTERVAL(POWER(2,31)-1,'SECOND')
---------------------------------------------------------------------------
+000024855 03:14:07.000000000

1 row selected.

Problem identified – it’s a numeric limit of the numtodsinterval() function. Interestingly it’s not documented in the Oracle manuals, in fact the SQL Reference manual suggests that this shouldn’t be a limit because it says that “any number value or anything that can be cast as a number is legal” and in Oracle-speak a number allows for roughly 38 digits precision.

Whilst we’ve identified the problem we still need a way to turn the input number into the timestamp we need – the OP didn’t need help with that one: divide by sixty and convert using minutes instead of seconds:


SQL> SELECT TO_TIMESTAMP('1970-01-01 00:00:00.0','YYYY-MM-DD HH24:MI:SS.FF') + NUMTODSINTERVAL(2850166802000/1000/60, 'MINUTE') FROM DUAL;

TO_TIMESTAMP('1970-01-0100:00:00.0','YYYY-MM-DDHH24:MI:SS.FF')+NUMTODSINTER
---------------------------------------------------------------------------
26-APR-60 01.00.02.000000000 AM

1 row selected

Job done.

Oracle Buys goBalto

Oracle Press Releases - Wed, 2018-10-17 07:00
Press Release
Oracle Buys goBalto Adds Leading Solution for Accelerating Clinical Trial Site Selection and Activation to Oracle Health Sciences Cloud

Redwood Shores, Calif.—Oct 17, 2018

Oracle today announced that it has entered into an agreement to acquire goBalto, which delivers leading cloud solutions to accelerate clinical trials by streamlining and automating the selection and set up of the best performing clinical research sites to conduct trials.

goBalto’s study startup solutions are activated at over 90,000 research sites across 2,000+ studies in over 80 countries to deliver significant savings to customers with over 30 percent quantifiable reduction in study startup cycle times.

Today, Oracle Health Sciences offers customers the industry's most advanced cloud solution for clinical trial planning, data collection, trial execution and safety management. goBalto adds the leading industry cloud solution that significantly reduces clinical trial startup time.  Together, Oracle and goBalto will provide the most complete end-to-end cloud platform dedicated to unifying action and accelerating results for the Life Sciences industry. 

“Clinical trial site selection and activation is one of the most manual and time-consuming processes for our customers,” said Steve Rosenberg, Senior Vice President and General Manager of Oracle Health Sciences Global Business Unit. “Oracle Health Sciences is designed to provide the industry with the best end-to-end clinical trial experience and the addition of goBalto will further allow our customers to remove another barrier from delivering treatments to patients faster.”

“We set out on a mission to streamline the clinical trial study startup process ten years ago because we saw how untenable it was for pharmaceutical companies and contract research organizations to track 1,000+ sites by 1,000+ specialists on spreadsheets,” said Jae Chung, Founder and President of goBalto.  

“We are delighted to join forces with Oracle as the benefits offered to both our customers and employees as a broader clinical trial continuum are unparalleled in the industry,” said Sujay Jadhav, CEO of goBalto.

More information about this announcement is available at www.oracle.com/gobalto.

Contact Info
Deborah Hellinger
Oracle Corporate Communications
+1.212.508.7935
deborah.hellinger@oracle.com
Ken Bond
Oracle Investor Relations
+1.650.607.0349
ken.bond@oracle.com
About Oracle

The Oracle Cloud offers complete SaaS application suites for ERP, HCM and CX, plus best-in-class database Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) from data centers throughout the Americas, Europe and Asia. For more information about Oracle (NYSE:ORCL), please visit us at www.oracle.com.

Trademarks

Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.

Safe Harbor

Oracle is currently reviewing the existing goBalto product roadmap and will be providing guidance to customers in accordance with Oracle’s standard product communication policies. Any resulting features and timing of release of such features as determined by Oracle’s review of goBalto’s product roadmap are at the sole discretion of Oracle. All product roadmap information, whether communicated by goBalto or by Oracle, does not represent a commitment to deliver any material, code, or functionality, and should not be relied upon in making purchasing decisions. It is intended for information purposes only, and may not be incorporated into any contract.

Cautionary Statement Regarding Forward-Looking Statements
This document contains certain forward-looking statements about Oracle and goBalto, including statements that involve risks and uncertainties concerning Oracle’s proposed acquisition of goBalto, anticipated customer benefits and general business outlook. When used in this document, the words “anticipates”, “can”, “will”, “look forward to”, “expected” and similar expressions and any other statements that are not historical facts are intended to identify those assertions as forward-looking statements. Any such statement may be influenced by a variety of factors, many of which are beyond the control of Oracle or goBalto, that could cause actual outcomes and results to be materially different from those projected, described, expressed or implied in this document due to a number of risks and uncertainties. Potential risks and uncertainties include, among others, the possibility that the transaction will not close or that the closing may be delayed, the anticipated synergies of the combined companies may not be achieved after closing, the combined operations may not be successfully integrated in a timely manner, if at all, general economic conditions in regions in which either company does business may deteriorate and/or Oracle or goBalto may be adversely affected by other economic, business, and/or competitive factors. Accordingly, no assurances can be given that any of the events anticipated by the forward-looking statements will transpire or occur, or if any of them do so, what impact they will have on the results of operations or financial condition of Oracle or goBalto. You are cautioned to not place undue reliance on forward-looking statements, which speak only as of the date of this document. Neither Oracle nor goBalto is under any duty to update any of the information in this document.

Talk to a Press Contact

Deborah Hellinger

  • +1.212.508.7935

Ken Bond

  • +1.650.607.0349

[BLOG] Oracle Critical Patch Update October 2018 Now Available

Online Apps DBA - Wed, 2018-10-17 06:32

Do you know that Oracle has released Critical Patch Update (CPU) for October 2018 with wide-ranging security update? Visit: https://k21academy.com/appsdba36 to check: ✔Affected Products and Patch Information ✔Doc to Refer to Apply CPU October patches & much more… Do you know that Oracle has released Critical Patch Update (CPU) for October 2018 with wide-ranging security […]

The post [BLOG] Oracle Critical Patch Update October 2018 Now Available appeared first on Oracle Trainings for Apps & Fusion DBA.

Categories: APPS Blogs

Connect to DV Datasets and explore many more new features in OAC / OAAC 18.3.3.0

Tim Dexter - Wed, 2018-10-17 05:26

Greetings !

Oracle Analytics Cloud (OAC) and Oracle Autonomous Analytics Cloud (OAAC) version 18.3.3.0 (also known as V5) were released last month. A rich set of new features has been introduced in this release across the different products (with product version 12.2.5.0.0) in the suite. You can check all the new features of OAC / OAAC in the video here.

The focus for BI Publisher on OAC / OAAC in this release has been to complement Data Visualization for pixel-perfect reporting, to optimize performance and to add self-service abilities. Here is the list of new features added in this release:

BI Publisher New Features in OAC V5.0

1. DV Datasets

Now you can leverage a variety of data sources covered by Data Visualization data sets, including cloud-based data sources such as Amazon Redshift and Autonomous Data Warehouse Cloud; big data sources such as Spark, Impala and Hive; and application data sources such as Salesforce, Oracle Applications, etc. BI Publisher is here to complement DV by creating pixel-perfect reports using DV datasets.

Check the documentation for additional details. Also, check this video to see how this feature works.

2. Upload Center

Now upload all files for custom configuration such as fonts, ICC Profile, Private Keys, Digital Signature etc.from the Upload Center as a self service feature available in the Administration page.

Additional details can be found in the documentation here.

3. Validate Data Model

Report Authors can now validate a data model before deploying the report in a production environment. This will help during a custom data model creation where data sets, LOVs and Bursting Queries can be validated against standard guidelines to avoid any undesired performance impact to the report server. 

Details available here.

4. Skip unused data sets

When a data model contains multiple data sets for different layouts, each layout might not use all the data sets defined in the data model. Now Report Authors can set a data model property to skip the execution of the unused data sets in a layout. Setting this property reduces data extraction time and memory usage and improves overall report performance.

Additional details can be found here.

5. Apply Digital Signature to PDF Documents

Digital Signature is a widely used feature in on-prem deployments and now it has been added to OAC too: a digital signature can be applied to PDF output. Digital signatures can be uploaded from the Upload Center, the required signature can be selected under the Security Center, and then applied to PDF outputs by configuring attributes under report properties or run-time properties.

You can find the documentation here. Also check this video for a quick demonstration.

6. Password protect MS Office Outputs - DocX, PPTX, XLSX

Now protect your MS Office output files with a password defined at report or server level.

Check the PPTX output properties, DocX output properties, Excel 2007 output properties

7. Deliver reports in compressed format

You can select this option to compress the output by including the file in a zip file before delivery via email, FTP, etc.

Additional details can be found here.

8. Request read-receipt and delivery confirmation notification 

You can opt to get delivery and read-receipt notification for scheduled job delivery via email.

Check documentation for additional details. 

9. Add scalability mode for Excel Template to handle large data size

Now you can set scalability mode for an Excel template. This can be done at system level, report level or template level. By setting this attribute to true, the engine flushes memory after a threshold value and, when the data exceeds 65K rows, rolls the data over into multiple sheets.

You can find the documentation here.

 

Stay tuned to hear more updates on features and functionalities ! Happy BIP'ing ...

 

Categories: BI & Warehousing

Fixing* Baseline Validation Tool** Using Network Sniffer

Rittman Mead Consulting - Wed, 2018-10-17 05:22

* Sort of
** Not exactly

In the past, Robin Moffatt wrote a number of blogs showing how to use various Linux tools for diagnosing OBIEE and getting insights into how it works (one, two, three, ...). Some time ago I faced a task which allowed me to continue Robin's cycle of posts and show you how to use Wireshark to understand how a certain Oracle tool works and how to search for the solution of a problem more effectively.

To be clear, this blog is not about the issue itself. I could simply write a tweet like "If you faced issue A then patch B solves it". The idea of this blog is to demonstrate how you can use somewhat unexpected tools and get things done.

Obviously, my way of doing things is not the only one. If you are good at searching My Oracle Support, you can possibly do it even faster, but what is good about my way (apart from it being mine, which is enough for me) is that it doesn't involve uneducated guessing. I make an observation and get a clear answer.

Most of my blogs have disclaimers. This one is not an exception, while its disclaimer is rather small. There is still no silver bullet. This won't work for every single problem in OBIEE. I didn't say this.

Now, let's get started.

The Task

The problem was the following: a client was upgrading its OBIEE system from 11g to 12c and obviously wanted to test for regression, making sure that the upgraded system worked exactly the same as the old one. Manual comparison wasn't an option since they have hundreds or even thousands of analyses and dashboards, so Oracle Baseline Validation Tool (usually called just BVT) was the first candidate as a solution to automate the checks.

Using BVT is quite simple:

  • Create a baseline for the old system.
  • Upgrade
  • Create a new baseline
  • Compare them
  • ???
  • Profit! Congratulations. You are ready to go live.

Right? Well, almost. The problem that we faced was that the BVT Dashboards plugin for 11g (a very old 11.1.1.7.something) gave exactly what was expected. But for 12c (12.2.1.something) we got all numbers with a decimal point, even though all analyses had a "no decimal point" format. So the first feeling we got at this point was that BVT doesn't work well for 12c, and that was somewhat disappointing.

SPOILER That wasn't true.

I made a simple dashboard demonstrating the issue.

OBIEE 11g

11g-dash-vs-bvt
Measure values in the XML produced by BVT are exactly as on the dashboard. Looks good.

OBIEE 12c

12c-dash-vs-bvt-1
Dashboard looks good, but values in the XML have decimal digits.

failed

As you can see, the analyses are the same, or at least they look very similar, but the XMLs produced by BVT aren't. From a regression point of view this dashboard should get the "DASHBOARDS PASSED" result, but it got "DASHBOARDS DIFFERENT".

Reading the documentation gave us no clear explanation for this behaviour. We had to go deeper and understand what actually caused it. Is it BVT screwing up the data it gets from 12c? Well, that is a highly improbable theory. Decimals were not simply present in the result, they were correct. Correct as in "the same as stored in the database", so we had to reject this theory.
Or maybe the problem is that BVT works differently with 11g and 12c? Well, this looks more plausible. A few years have passed since 11.1.1.7 was released and it would not be too surprising if the old version and the modern one exposed different APIs used by BVT, causing this problem. Or maybe the problem is that 12c itself ignores formatting settings. Let's find out.

The Tool

Neither the BVT logs nor the OBIEE logs gave us any insight. From every point of view, everything was working fine. Except that we were getting a 100% mismatch between the source and the target. My hypothesis was that BVT worked differently with OBIEE 11g and 12c. How can I check this? Decompiling the tool and reading its code would possibly give me the answer, but it is not legal. And even if it were legal, the latest BVT is more than 160 megabytes, which would give an insane amount of code to read, especially considering that I don't actually know what I'm looking for. Not an option. But BVT talks to OBIEE via the network, right? Therefore we can intercept the network traffic and read it. Shall we?

There are a lot of ways to do it. I work with OBIEE quite a lot and Windows is the obvious choice for my platform. And hence the obvious tool for me was Wireshark.

Wireshark is the world’s foremost and widely-used network protocol analyzer. It lets you see what’s happening on your network at a microscopic level and is the de facto (and often de jure) standard across many commercial and non-profit enterprises, government agencies, and educational institutions. Wireshark development thrives thanks to the volunteer contributions of networking experts around the globe and is the continuation of a project started by Gerald Combs in 1998.

What this "About" doesn't say is that Wireshark is open-source and free. Which is quite nice I think.

Installation Details

I'm not going to go into too many details about the installation process. It is quite simple and straightforward. Keep all the defaults unless you know what you are doing, reboot if asked and you are fine.

If you've never used Wireshark or analogues, the main question would be "Where to install it?". The answer is pretty simple - install it on your workstation, the same workstation where BVT is installed. We're going to intercept our own traffic, not someone else's.

A Bit of Wireshark

Before getting to the task we want to solve, let's spend some time familiarizing ourselves with Wireshark. Its starting screen shows all the network adapters I have on my machine. The one I'm using to connect to the OBIEE servers is "WiFi 2".

Screenshot-2018-10-09-13.50.44

I double-click it and immediately see a constant flow of network packets flying back and forth between my computer and local network machines and the Internet. It's a bit hard to see any particular server in this stream. And "a bit hard" is quite an understatement, to be honest, it is impossible.

wireshark

I need a filter. For example, I know that my OBIEE 12c instance IP is 192.168.1.226. So I add ip.addr==192.168.1.226 filter saying that I only want to see traffic to or from this machine. Nothing to see right now, but if I open the login page in a browser, for example, I can see traffic between my machine (192.168.1.25) and the server. It is much better now but still not perfect.

Screenshot-2018-10-09-14.08.52

If I add http to the filter, like this: http and ip.addr==192.168.1.226, I can definitely get a much clearer view.

For example, here I opened the http://192.168.1.226:9502/analytics page just like any other user would do. There are quite a lot of requests and responses. The browser asked for the /analytics URL; after a few redirects the server replied that the actual address for this URL is the login.jsp page; then the browser requested the /bi-security-login/login.jsp page using the GET method and got it back with HTTP code 200. Code 200 shows that there were no issues with the request.

startpage

Let's try to log in.

login

The top window is a normal browser and the bottom one is Wireshark. Note that my credentials were sent in clear text, and I think that is a very good argument in defence of using HTTPS everywhere.

That is a very basic use of Wireshark: start monitoring, do something, see what was captured. I barely scratched the surface of what Wireshark can do, but that is enough for my task.
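
As a side note (not part of the original walkthrough), the same targeted capture can be done from the command line with tshark, the CLI companion shipped with Wireshark. The interface name and server address below are the ones used above and may need adjusting on your machine (tshark -D lists the available interfaces):

# live-capture traffic to/from the OBIEE server and display only HTTP packets
tshark -i "WiFi 2" -f "host 192.168.1.226" -Y http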

Wireshark and BVT 12c

The idea is quite simple. I should start capturing my traffic then use BVT as usual and see how it works with 12c and then how it works with 11g. This should give me the answer I need.

Let's see how it works with 12c first. To make things more simple I created a catalogue folder with just one analysis placed on a dashboard.

bvt-dashboard-1

It's time to run BVT and see what happens.

Screenshot-2018-10-11-17.49.59

Here is the dataset I got from OBIEE 12c. I slightly edited and formatted it to make it easier to read, but didn't change anything important.

dataset12--1

What did BVT do to get this result? What API did it use? Let's look at Wireshark.

Screenshot-2018-10-11-19.09.27

The first three lines are the same as with a browser. I don't know why BVT needs them, but I don't mind. Then BVT gets the WSDL from OBIEE (GET /analytics-ws/saw.dll/wsdl/v6/private). There are multiple similar request-response pairs flying back and forth because the WSDL is big enough to be downloaded in chunks. A purely technical thing, nothing strange or important here.
But now we know what API BVT uses to get data from OBIEE. I don't think anyone is surprised that it is the Web Services API. Let's take a look at the Web Services calls.

First logon method from nQSessionService. It logs into OBIEE and starts a session.

Screenshot-2018-10-11-19.36.59

The next requests get catalogue item descriptions for the objects in my /shared/BVT folder. We can see a set of calls to webCatalogService methods. These calls read my web catalogue structure: all folders, subfolders, the dashboard and the analysis. Pretty simple, nothing really interesting or unexpected here.

ws01

Then we can see how BVT uses generateReportSQLResult from reportService to get logical SQL for the analysis.

Screenshot-2018-10-11-19.42.07

And gets analysis' logical SQL as the response.

Screenshot-2018-10-11-19.45.10

And the final step - BVT executes this SQL and gets the data. Unfortunately, it is hard to show the data on a screenshot, but the line starting with [truncated] is the XML I showed before.

Screenshot-2018-10-12-12.19.58

And that's all. That is how BVT gets data from OBIEE.

I did the same for 11g and saw absolutely the same procedure.

Screenshot-2018-10-11-21.01.35

My initial theory that BVT may have been using different APIs for 11g and 12c was busted.

From my experiment, I found out that BVT uses xmlViewService to actually get the data, and I also now know that it uses logical SQL to do so. Looking at the documentation I can see that xmlViewService has no options related to any formatting. It is a purely data-retrieval service; it can't preserve any formatting and is supposed to give only the data. But hey, I started with the statement "11g preserves formatting" – how is that possible? Well, that was a simple coincidence. It doesn't.

In the beginning, I had very little understanding of what keywords to use on MoS to solve the issue. "BVT for 12c doesn't preserve formatting"? "BVT decimal part settings"? "BVT works differently for 11g and 12c"? Now I have something much better - "executeSQLQuery decimal". 30 seconds of searching and I know the answer.

mos-1

This was fixed in 11.1.1.9, but there is a patch for 11.1.1.7.some_of_them. The patch fixes an 11g issue which prevents BVT from getting decimal parts of numbers.

pass

As you may have noticed, I had no chance of finding this using my initial problem description. Neither BVT, nor 12c, nor 11.1.1.7 were mentioned. The thread looks completely unrelated to the issue; I had zero chance of finding it that way.

Conclusion

OBIEE is complex software and solving issues is not always easy. Unfortunately, no single method is enough for solving all problems. Usually, log files will help you. But when something works, just not the way you expect, log files can be useless. In my case BVT was working fine, 11g was working fine, and 12c was working fine too. Nothing special was being written to the logs. That is why sometimes you may need unexpected tools, just like this one. Thanks for reading!

Categories: BI & Warehousing

Error while opening Database

Tom Kyte - Wed, 2018-10-17 03:26
HI, THERE I HAVE A SITUATION HERE, DUE TO POWER OUTAGE, DATABASE KEEP GIVING ERROR <code>select name,open_mode from v$database; NAME OPEN_MODE --------- -----------------...
Categories: DBA Blogs
