• Tame Big Data using Oracle Data Integration

    This demo illustrates how you can move and transform all your data using Oracle Data Integration - whether that data resides in Oracle Database, Hadoop, third-party databases, applications, files, or a combination of these sources. The "Design once, run anywhere" paradigm allows you to focus on the logical rules of data transformation and movement while the choice of implementation is separate and can evolve as use cases change and new technologies become available.
    The demo highlights Oracle Data Integrator and Oracle GoldenGate integrating with Oracle Loader for Hadoop, Oracle Big Data SQL, Hive, Sqoop, Flume, Pig, and HDFS.

    Duration: 60 minutes

    This content is intended for the following job role(s): Data Scientist, Database Designer, Data Warehouse Developer, Data Warehouse Administrator

    Release Date: 25-SEP-2014

Reviews (6)

  • 7.7 years ago
    If you are using the Avro target format, note that it does not accept ‘$’ characters in object names, so you need to manually configure the following in the properties file to remove the ‘$’ from the object name. For example: gg.schemareplaceregex=(.+)\\$ gg.schemareplacestring=$1
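    The replace regex above uses Java-style syntax (backreference $1). As a sanity check, here is a minimal Python sketch of the same substitution (Python's re module uses \1 rather than $1 for backreferences; the example object name is hypothetical):

    ```python
    import re

    # Strip a trailing '$' from an object name, mirroring
    # gg.schemareplaceregex=(.+)\$ with gg.schemareplacestring=$1
    def strip_dollar(name: str) -> str:
        return re.sub(r"(.+)\$", r"\1", name)

    print(strip_dollar("MOVIEDEMO$"))  # MOVIEDEMO
    print(strip_dollar("MOVIEDEMO"))   # MOVIEDEMO (no '$', unchanged)
    ```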
  • 7.9 years ago
    I tried to run through the step "Ingest Change Data using Oracle GoldenGate for Big Data" in Part 1, but it seems to have failed to capture the new records. Do you have any advice for me? Thanks.
  • 8.7 years ago
    Hello experts, I'm trying to load an Oracle table into a Hive table using LKM SQL to Hive Sqoop. The problem is that I get an error when the Oracle table is written into Hive, because one of the columns is of the RAW data type and ODI/LKM doesn't convert it into a data type that Hive supports. Is there any way that I can make this conversion? Thank you, Pedro Santos
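    One common workaround (an assumption here, not part of the demo) is to convert the RAW column to a hexadecimal string on the source side, for example with Oracle's RAWTOHEX function, so that it maps cleanly to a Hive STRING. The round trip is lossless, as this Python sketch with a made-up byte value illustrates:

    ```python
    # RAW bytes <-> hex string round trip, showing why a hex STRING
    # is a safe carrier for RAW data in Hive (example value is hypothetical)
    raw_value = bytes([0xCA, 0xFE, 0xBA, 0xBE])

    hex_string = raw_value.hex().upper()   # what RAWTOHEX would produce
    print(hex_string)                      # CAFEBABE

    restored = bytes.fromhex(hex_string)   # convert back without loss
    assert restored == raw_value
    ```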
  • 8.7 years ago
    Hi, We are using the latest VM (4.2.1) and we have the same problem. Can you help us fix this problem? Thanking you in advance, --Nuno
  • 8.8 years ago
    The steps are really clear and easy to follow until executing the mapping 'A - Load Movies (Sqoop)'. The execution of this mapping fails with the error message: '15/09/10 13:54:07 INFO mapreduce.Job: Task Id : attempt_1441874224765_0007_m_000000_0, Status : FAILED Error: QueryResult : Unsupported major.minor version 52.0'. Unfortunately I haven't been able to fix this problem, which I suspect has something to do with the Java version being used. So I'm kind of stuck now... any tips or help would be greatly appreciated.
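    For context on the error above: class-file major version 52 corresponds to Java 8, so 'Unsupported major.minor version 52.0' means bytecode compiled for Java 8 is being run on an older JRE. The version mapping is a standard JVM fact, not something from the demo; a minimal lookup sketch:

    ```python
    # Java class-file major version -> first JDK release that can run it
    CLASS_FILE_MAJOR = {49: "Java 5", 50: "Java 6", 51: "Java 7", 52: "Java 8"}

    def required_jdk(major: int) -> str:
        return CLASS_FILE_MAJOR.get(major, "unknown")

    print(required_jdk(52))  # Java 8
    ```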
  • 9.2 years ago
    Satisfied. Thank you for sharing.