Big Data Discovery configuration in Oracle Big Data Lite VM 4.2.1

Christos-Iraklis Tsatsoulis | Big Data, Oracle Big Data Discovery | 1 Comment

The latest version (4.2.1) of the Oracle Big Data Lite VM, among many other additions, now also includes the much-anticipated Oracle Big Data Discovery (v. 1.1), which I had not played with so far (it is a new product), so I thought I would take it for a ride. Since my test data included geolocation attributes (latitude/longitude), one of the first things I …

Ansible playbook to provision a WebLogic Fusion Middleware Domain on RHEL 7

Chris Vezalis | Ansible, DevOps, Fusion Middleware, WebLogic

An Ansible playbook for installing and configuring a WebLogic 12c server with Oracle Fusion Middleware 12.1.3 software on a Red Hat Linux 7 (RHEL/CentOS/Oracle Linux) system. This playbook is for version 12.1.3 of the WebLogic and Fusion Middleware Infrastructure software. You can download the code here: https://github.com/cvezalis/weblogic-ansible Prerequisites for running the playbook: configure your environment variables in infra-vars.yml. This file contains variables …

Dataframes from CSV files in Spark 1.5: automatic schema extraction, neat summary statistics, & elementary data exploration

Christos-Iraklis Tsatsoulis | Big Data, Spark | 25 Comments

In a previous post, we took a brief look at creating and manipulating Spark dataframes from CSV files. In the couple of months since, Spark has already gone from version 1.3.0 to 1.5, with more than 100 built-in functions introduced in Spark 1.5 alone; so we thought it was a good time to revisit the subject, this time also utilizing the external …
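
To give a flavour of what the post covers, here is a minimal Scala sketch of reading a CSV file into a dataframe with automatic schema inference and getting summary statistics; the file path, column name, and the use of the spark-csv external package are illustrative assumptions, not taken verbatim from the post.

```scala
import org.apache.spark.sql.SQLContext

// Assumes an existing SparkContext `sc` (e.g. a spark-shell launched with
// --packages com.databricks:spark-csv_2.10:1.2.0 so the spark-csv package is available)
val sqlContext = new SQLContext(sc)

// Read the CSV file, letting spark-csv infer column names and types
val df = sqlContext.read
  .format("com.databricks.spark.csv")
  .option("header", "true")       // first line holds the column names
  .option("inferSchema", "true")  // scan the data to guess column types
  .load("/path/to/data.csv")      // illustrative path

df.printSchema()                        // inspect the automatically extracted schema
df.describe("some_numeric_col").show()  // count, mean, stddev, min, max for an illustrative column
```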

Reusing ADF Business Components to develop web services

Michael Koniotakis | DevOps, Oracle ADF | Leave a Comment

Numerous ADF applications have been developed in recent years, mostly using Business Components in the model layer. With the increasing adoption of SOA processes and of mobile applications based on web services, there is a large and growing need to expose the same functionality via web services. According to the documentation, “the same application module can support …

Dynamic Dashboard

Konstantinos Chatzis | Oracle ADF | Leave a Comment

While I was checking out the new Oracle Alta UI demo (http://jdevadf.oracle.com/workbetter/faces/index.jsf), I noticed a dashboard view listing all employees. The panelDashboard component of Oracle Application Development Framework (Oracle ADF) Faces is a JavaServer Faces (JSF) layout container that Oracle ADF developers can use to implement such information dashboards. I decided to use it in a project that we …

Maven replace tokens inside files before application build using profiles

Chris Vezalis | DevOps, Maven, Oracle ADF

When building projects for multiple environments (for example a developer test server, a system test server, a user acceptance test server, a training server, and finally a production server), applications need to be passed environment-specific variables. One way to solve this is to use tokens in files that Maven will replace with real values found in separate property files for …

Creating Data Models in ODI 12c with Common Format Designer (CFD)

Gabriel Sideras | Data Integration | Leave a Comment

In this post I will try to outline the capabilities of the Common Format Designer (CFD), which can be used to create data models in ODI 12c. CFD can be used to quickly design a data model, forward-engineer a database schema, and generate a default data flow between data models. Oracle Data Integrator (ODI) is Oracle’s flagship data integration product. In 12c a …

Development and deployment of Spark applications with Scala, Eclipse, and sbt – Part 2: A Recommender System

Constantinos Voglis | Big Data, Spark | 11 Comments

In our previous post, we demonstrated how to set up the necessary software components for developing and deploying Spark applications with Scala, Eclipse, and sbt, and we included a simple example application. In this post, we take the demonstration one step further: we discuss a more serious application, a recommender system, and present the …
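
As a hint of what such a recommender might involve, here is a minimal Scala sketch assuming the collaborative-filtering route via MLlib's ALS; the input path, rating format, and hyper-parameter values are illustrative assumptions, not necessarily what the post itself uses.

```scala
import org.apache.spark.mllib.recommendation.{ALS, Rating}

// Assumes an existing SparkContext `sc` and a CSV of (user, product, rating)
// triples; the path and hyper-parameter values below are illustrative
val ratings = sc.textFile("/path/to/ratings.csv").map { line =>
  val Array(user, product, rating) = line.split(',')
  Rating(user.toInt, product.toInt, rating.toDouble)
}

val rank = 10           // number of latent factors
val numIterations = 10  // ALS iterations
val lambda = 0.01       // regularization parameter
val model = ALS.train(ratings, rank, numIterations, lambda)

// Top-5 product recommendations for a given user id
model.recommendProducts(42, 5).foreach(println)
```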

Development and deployment of Spark applications with Scala, Eclipse, and sbt – Part 1: Installation & configuration

Constantinos Voglis | Big Data, Spark | 23 Comments

The purpose of this tutorial is to set up the necessary environment for developing and deploying Spark applications with Scala. Specifically, we are going to use the Eclipse IDE for application development and deploy the applications with spark-submit. The glue that ties everything together is the sbt interactive build tool. The sbt tool provides plugins used to: Create an Eclipse …
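
As a rough illustration of the sbt side of such a setup, here is a minimal build definition; the project name, Scala version, and Spark version are illustrative assumptions rather than the exact values used in the post.

```scala
// build.sbt -- minimal definition for a Spark application
// (project name and version numbers are illustrative)
name := "spark-example"

version := "0.1.0"

scalaVersion := "2.10.5"

// Spark is supplied by the cluster at runtime, hence the "provided" scope
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.0" % "provided"

// project/plugins.sbt -- the sbteclipse plugin adds an `eclipse` task that
// generates Eclipse project files from this build definition:
//   addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "4.0.0")
```

Running sbt package then produces a jar that can be handed to spark-submit.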