• Tame Big Data using Oracle Data Integration

    This demo illustrates how you can move and transform all your data using Oracle Data Integration, whether that data resides in Oracle Database, Hadoop, third-party databases, applications, files, or a combination of these sources. The "Design once, run anywhere" paradigm lets you focus on the logical rules of data transformation and movement, while the choice of implementation remains separate and can evolve as use cases change and new technologies become available.
    The demo highlights Oracle Data Integrator and Oracle GoldenGate integrating with Oracle Loader for Hadoop, Oracle Big Data SQL, Hive, Sqoop, Flume, Pig, and HDFS.

    Duration: 60 minutes

    This content is intended for the following job role(s): Data Scientist, Database Designer, Data Warehouse Developer, Data Warehouse Administrator

    Release Date: 25-SEP-2014

Reviews (6)

  • 3.1 years ago
    thomas.vengal
    If you are using the Avro target format, it does not accept '$' characters in the object name, so you need to manually configure the following in the properties file to remove the '$' from the object name (see the properties sketch after the reviews). Here is an example:
        gg.schemareplaceregex=(.+)\\$
        gg.schemareplacestring=$1
  • 3.3 years ago
    trung.phan
    I tried to run through the step 'Ingest Change Data using Oracle GoldenGate for Big Data' in Part 1, but it seems to have failed to capture the new records. Do you have any advice for me? Thanks.
  • 4.1 years ago
    pedro.santos
    Hello experts, I'm trying to load an Oracle table into a Hive table using LKM SQL to Hive Sqoop. The problem is that I'm getting an error when this Oracle table is written into Hive, because one of the columns is of the RAW data type and ODI/the LKM does not convert it into a data type that Hive supports. Is there any way I can perform this conversion? (One possible conversion is sketched after the reviews.) Thank you, Pedro Santos
  • 4.2 years ago
    nuno.almeida
    Hi, We are using the latest VM (4.2.1) and we have the same problem. Can you help us fix this problem? Thanking you in advance, --Nuno
  • 4.2 years ago
    e.vanderweijden
    The steps are really clear and easy to follow until executing the mapping 'A - Load Movies (Sqoop)'. The execution of this mapping fails with the error message: '15/09/10 13:54:07 INFO mapreduce.Job: Task Id : attempt_1441874224765_0007_m_000000_0, Status : FAILED Error: QueryResult : Unsupported major.minor version 52.0'. Unfortunately I haven't been able to fix this problem, which I suspect has something to do with the Java version being used. So I'm kind of stuck now... any tips or help is greatly appreciated.
  • 4.7 years ago
    pandey.lalit24
    Satisfied. Thank you for sharing.
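
    Note on thomas.vengal's Avro tip above: the sketch below only shows how those two entries might sit on their own lines in the GoldenGate for Big Data handler properties file; the comments are an interpretation of the reviewer's example, not taken from the demo itself.

        # Remove a trailing '$' from the generated object/schema name so the
        # Avro formatter accepts it: the regex captures everything before the
        # final '$' and the replacement keeps only that captured group ($1).
        gg.schemareplaceregex=(.+)\\$
        gg.schemareplacestring=$1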
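
    Note on pedro.santos's RAW question above: one possible workaround, assuming a hexadecimal string representation is acceptable on the Hive side, is to convert the RAW column with Oracle's RAWTOHEX function in the mapping's source expression so it lands in a Hive STRING column. The table and column names below are illustrative only, not from the demo.

        -- Illustrative sketch: movies_src and payload_raw are made-up names.
        -- RAWTOHEX returns the hexadecimal character representation of the
        -- RAW value, which Sqoop/Hive can then load as a STRING column.
        SELECT movie_id,
               RAWTOHEX(payload_raw) AS payload_hex
          FROM movies_src;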
