Agile operations == DEVOPS

It has been around for some years, but when I read about this concept last year I immediately felt comfy: DEVOPS. It is actually two words, and it means the marriage between IT development and IT operations.

Screenshot of a Windows error: an operations problem

When these departments or people do not communicate, bad things happen. I have seen and felt this in person; I remember nightly visits to a data center in Rotterdam to reboot Unix boxes in the middle of the night (Dutch readers may remember Girotel Online). What often happens is a disconnect between what is traditionally considered development work and what is traditionally considered operations work.

Smoothing the bond between DEV and OPS seems only logical to me, as I started my career in IT systems administration and made the switch to the developer 'camp' 15 years ago. Since those days I have never understood the divide between development and administration departments within big enterprises. I have worked in both worlds, and in my opinion we all try to reach the same end goal: a happy customer.

Oracle External tables over the network

At one of the projects we do for KPN we use Oracle as a database. For some daily batches we use the external tables mechanism to load data from files we receive from another supplier in the chain.

The files are in comma-separated values (CSV) format, like the sample below (I used the Oracle example):

56november, 15, 1980  baker  mary   alice  09/01/2004
87december, 20, 1970  roper  lisa   marie  01/01/1999

Oracle can represent such a file internally as a table, so you can run SQL queries against it. A special driver is used to access these files, and originally it could only access a local filesystem on the server. At the client we used a volume mounted from a SAN over Fibre Channel. Unfortunately we ran into a problem when we wanted to use this mechanism on our production servers. I could reproduce the problem on our development servers using iSCSI (we don't have Fibre Channel there), which is roughly the same mechanism the Fibre Channel driver uses.

I created an iSCSI target and a volume and mounted it on v:\mount.

As a DBA, create this directory object in SQL*Plus (or your IDE) to point to a directory on an iSCSI drive:

CREATE DIRECTORY ext_tab_dir AS 'v:\test';
GRANT READ, WRITE ON DIRECTORY ext_tab_dir TO SCOTT;

As user SCOTT we create this table:

CREATE TABLE emp_load
  (employee_number      CHAR(5),
   employee_dob         CHAR(20),
   employee_last_name   CHAR(20),
   employee_first_name  CHAR(15),
   employee_middle_name CHAR(15),
   employee_hire_date   DATE)
ORGANIZATION EXTERNAL
  (TYPE ORACLE_LOADER
   DEFAULT DIRECTORY ext_tab_dir
   ACCESS PARAMETERS
     (RECORDS DELIMITED BY NEWLINE
      FIELDS (employee_number      CHAR(2),
              employee_dob         CHAR(20),
              employee_last_name   CHAR(18),
              employee_first_name  CHAR(11),
              employee_middle_name CHAR(11),
              employee_hire_date   CHAR(10) date_format DATE mask "mm/dd/yyyy"
             )
     )
   LOCATION ('emp_load.dat')
  );

The following error will occur if you try to view data in this table as user SCOTT, for example with this SQL:

select * from emp_load;

ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04027: file name check failed: V:\test\EMP_LOAD_1488_1176.log

Now, to prove that it is the external tables driver that is not working, I will show you that the UTL_FILE API (another mechanism to access files from PL/SQL) DOES work.

Try this anonymous PL/SQL block (make sure server output is enabled):

DECLARE
  l_file     UTL_FILE.file_type;
  l_location VARCHAR2(100) := 'EXT_TAB_DIR';
  l_text     VARCHAR2(32767);
BEGIN
  -- Open the file.
  l_file := utl_file.fopen(l_location, 'emp_load.dat', 'r', 32767);
  -- Read and output the first line (header record).
  utl_file.get_line(l_file, l_text, 32767);
  dbms_output.put_line(l_text);
  utl_file.fclose(l_file);
END;
/

It will print:

56november, 15, 1980 baker mary alice 09/01/2004

So how do we make external tables work in this situation? Use a network share that points to the same location as the Fibre Channel/iSCSI drive, mapped as a network drive (let's call this share 'test'). First map the network location to a drive letter on the Windows command prompt of the Oracle server:

net use l: \\192.168.50.5\test

Now change the Oracle directory object to point to the network share containing the file:

CREATE OR REPLACE DIRECTORY ext_tab_dir AS '\\192.168.50.5\tmp\';
GRANT READ, WRITE ON DIRECTORY ext_tab_dir TO SCOTT;
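
With the directory object now pointing at the network share, the query that failed earlier should simply return rows. A quick sanity check (assuming emp_load.dat is present on that share):

-- as user SCOTT: the external table is now readable over the network share
SELECT * FROM emp_load;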

Agile methods and technologies for KPN HR-Analytics

This is a translation of the first part of the introduction of my thesis:

Frontpage of thesis
The need for relevant information to distinguish yourself from others and to be faster than your competitors has existed since the creation of the first societies. In the course of recorded history, more and more data accumulated in various administrations. Since the advent of the computer and the use of databases, there have been all kinds of new possibilities to analyze this data and transform it into useful information. The management of large organizations (government and industry) soon acknowledged the usefulness of these analyses.

The systems and applications for this analysis were called Management Information Systems (MIS). In the last decade, features were added to these systems that enabled aggregation and different ways of presenting information. The data is collected in a so-called data warehouse.
Since this expansion of functionality, these systems have been known as Business Intelligence (BI) systems. They are indispensable for organizations in business and government, supporting management in strategic and tactical issues.

Yet the amount of data stored within organizations keeps growing at a tremendous pace. This is due to the ever-expanding external environment in which organizations (are able to) operate (Aldrich & Mindlin, 1978). Think of blurring boundaries and faster communication; in short, globalization. The processing of this data stream will have to take place more rapidly in order to maintain a competitive advantage (Choo, 1995).

The design and construction of these systems usually requires a lot of time and money. The potential returns of introducing such systems are high, but research shows that there is a large gap between investing in a better BI environment and reaping the benefits of it (Williams & Williams, 2007). Unfortunately, one third to one half of data warehouse development projects fail (Hayen et al., 2007). One of the reasons is the complexity of the systems and of the data to be processed: various techniques are involved and the requirements are often unclear. A consequence of these failures is that companies have poorer reporting capabilities; they lack insight into their own activities and are less successful.

The fact that many BI projects fail is very unfortunate, because in recent years much progress has been made in the mainstream software development industry through the use of agile development methodologies. These emphasize the continuous delivery of new functionality, intensive communication between customers and the development team, and a total focus on quality. The agile principles can also be applied to BI development projects, but in the real world this is not yet common.
Several authors have written on this subject and argue that it is a misconception that agile principles are not applicable to BI (Collier, 2011). The best practices that grew out of these principles can be used in modified form.
This thesis examines whether this also applies to KPN's HR BI environment.

Back to life

A month or so ago I turned on my iMac and something weird happened: the screen stayed as black as oil. I reset it and it did actually start to boot, but after a couple of minutes... bam, black screen. I could hear the fans still spinning, but the machine was dead. I did some NVRAM resets and the like and managed to boot to the console. Unfortunately, after examining the log files I found kernel panics while loading the NVIDIA driver, which pointed to graphics card problems. After some research on the web I found that my iMac's GPU (NVIDIA 8800GS) is known for overheating problems. I must say I never experienced this before, but apparently all of these cards eventually succumb.

I called Apple Support and they let me send the iMac to a nearby store for repair. A week later came the dreaded answer: 'We do not supply this card anymore, so we cannot fix your Mac'.

This was quite a blow, since I really love this hardware. They even wanted me to pay for examining the iMac, but fortunately an Apple Support manager offered reimbursement by Apple.

So what then? I found a company on the internet that has made repairing chipsets its business. Still quite a gamble, as it costs about 230 euros, but it was my last chance to get the iMac working again. Remember, the form factor of this card is so specialised that, along with an obscure ATI card, it is the only one that fits into the iMac's casing attached to the logic board.

Screen removed

Bezel removed

LCD off

Naked iMac

Logic board with video card attached

The defective card

So I had to disassemble the entire iMac to get the graphics board out of it. This was quite an endeavour, because Apple used some special Torx screws and clever engineering. But I succeeded and packed up the video card to be sent to the UK.

After a week or so it returned, supposedly fitted with a brand new GPU (graphics processing unit). A bit nervous and hopeful, I reassembled the iMac.

Reassemble!

Booting...

My old desktop!

It works!! Thanks Haytek LTD!!

Go supersecure

At Cloud Seven we endorse the super-secure G/On product. It replaces the VPN technology that is commonly used to give employees remote access to the company network. The easy part: you just plug in the USB stick, boot your computer, log in and choose your application; it starts automatically and connects to your company's servers and networks!

G/On USB stick

The product was acquired in 2012 by Excitor, the creators of DME, who integrated G/On into their mobile app as the Appbox feature (mobile secure email synchronization and remote secure apps).
It was recently selected to secure English councils in the UK.

It is very easy to implement, uses industry-grade encryption for connections, and links up to your Active Directory or LDAP server for authentication.

Developing using Webstorm+Docker+nodejs on OS X and debugging it

First install the latest Docker (I use the Mac version for this tutorial) from here: http://docs.docker.com/mac/step_one/

Get Webstorm: https://www.jetbrains.com/webstorm/

Start the Docker Quickstart Terminal (see screenshot). This starts the default docker VM (Linux inside OS X). You can find it later by issuing 'docker-machine ls':

docker-machine ls

Download an image inside the docker terminal:

docker pull ubuntu

To run an interactive shell in the Ubuntu image:

docker run -i -t ubuntu /bin/bash

Of course we also want to mount our home directory, e.g.:

docker run -it -v $HOME:/mnt ubuntu

Inside the container you can verify that the mount is there:

ls /mnt

Create a folder on the host system that is going to hold the app you want to build, for example /Users/thajoust/dev. Create a package.json file by issuing 'npm init' in $HOME/dev on the host system, or create an empty one from the host system shell:

touch $HOME/dev/package.json

Now if you go to the docker terminal and issue 'ls /mnt/dev/' you will see the package.json file. You can exit the container by issuing 'exit' in the container shell. Next, run the docker image from the docker terminal, this time mounting the new directory directly:

docker run -it -v $HOME/dev:/testApp -w /testApp ubuntu

You will now be in the container and be able to see the package.json. You can safely delete it now.

Next up is installing node in the container. First update the package index, then install nodejs:

apt-get update
apt-get install nodejs

Install npm:

apt-get install npm

Install n (a Node version manager) globally:

npm install n -g

Then install wget, and use n to install the latest nodejs:

apt-get install wget
n latest

Install the express generator:

npm install -g express-generator

Inside your testApp folder you can create an express app:

express myapp

Now we have to introduce a quick workaround for the symlink problems that occur on some platforms when npm creates bin links:

echo "bin-links=false" >> $HOME/.npmrc

There is now a folder named myapp; let's continue. Install all the modules required for running your app and start it:

cd myapp
npm install
npm start

Now go to Webstorm and open the myapp folder; you will see all the files. Configure the Run/Debug Configuration: create a Node.js Remote Debug configuration using the + in the left corner, with Host 192.168.99.100 and Port 5858.

Exit the docker container and commit it:

exit
docker ps -l
docker commit -m "message" containerId [newNameForTheImage]:[tag]

Now we need to expose the ports the node app listens on (both the app and the debugger):

docker run -it -v $HOME/dev:/testApp -w /testApp/myapp -p 3000:3000 -p 5858:5858 ubuntu:version1

Run the node express app in debug mode:

node --debug ./bin/www

If you go to Webstorm and run the app with your Debug Configuration you will see that everything is working. Below is the breakpoint being hit when I set it and go to http://192.168.99.100:3000.

breakpoint reached
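
As an aside, the in-container setup above boils down to roughly the following sequence, which you could paste into a fresh ubuntu container instead of retyping each step (a sketch based on the commands in this post; exact package versions depend on the Ubuntu release):

# run inside the container as root
apt-get update
apt-get install -y nodejs npm wget
npm install -g n                  # the n version manager
n latest                          # use n to install the latest node
npm install -g express-generator  # the express scaffolding tool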

Internet sharing on a Mac

I use a Mitel desk phone which connects to my company's PBX when working from home. I used to hook it up to my iMac and use Internet Sharing to get it connected. I tweaked the default network of the internal DHCP server (to avoid a clash with the 2.0 network at my company) by adding a 'SharingNetwork' key to the NAT configuration in /Library/Preferences/SystemConfiguration/com.apple.nat.

As of Yosemite this ceased to work and I used an extra wire to my switch to connect the Mitel.

I was certain it should work, so I spent some time on Google and found the solution at http://hints.macworld.com/article.php?story=20090510120814850:

sudo defaults write /Library/Preferences/SystemConfiguration/com.apple.nat NAT -dict-add SharingNetworkNumberStart 192.168.20.0
sudo defaults write /Library/Preferences/SystemConfiguration/com.apple.nat NAT -dict-add SharingNetworkNumberEnd 192.168.20.20
sudo defaults write /Library/Preferences/SystemConfiguration/com.apple.nat NAT -dict-add SharingNetworkMask 255.255.255.0
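
To check that the keys actually ended up in the NAT dictionary, you can read them back (a quick sanity check; Internet Sharing will likely need to be toggled off and on before the new range is used):

# print the NAT configuration, including the SharingNetwork* keys
sudo defaults read /Library/Preferences/SystemConfiguration/com.apple.nat NAT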

Up to speed

Finally an update on my blog for you. Let me bring you up to speed on recent developments:

  • I am now officially a Bachelor of ICT!

  • I am presenting the conclusions from my thesis at KPN soon

  • The code I used for my final assignment can be downloaded at https://gitlab.cloudseven.nl:8081/joost/horang

  • Visited XebiCon 2015 in May and heard lots of interesting stuff about agile development

  • Visited the Microsoft TechDays and learned a lot about JavaScript, React and cloud developments

  • Visited Paris to attend React Europe 2015 and learned tons about React and JavaScript