Wednesday, 13 February 2013

R Statistical Tool Assignment-5

This class on 12th of Feb was basically conceptual, covering concepts like taking a time series and:

-Finding its returns;
-Drawing an ACF plot to check the stationarity of the data;
-Analysing the data with the Augmented Dickey-Fuller (ADF) test;
-Calculating the historical volatility and standard deviation of a data set;
-Standardizing a given data set (see the sketch after this list).
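
Standardizing was only discussed in class, not coded; as a minimal sketch (my own, not from the class notes), R's built-in scale() centres a vector to mean 0 and scales it to standard deviation 1:

> x<-closingval             # e.g. the NIFTY closing prices used below
> x.std<-scale(x)           # (x - mean(x)) / sd(x)
> mean(x.std); sd(x.std)    # check: approximately 0 and 1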

Assignment:
Create the log-returns series and calculate its historical volatility.
Formulae (S_t is the closing price on day t):
1) (log S_t - log S_(t-1)) / log S_(t-1)
OR
2) log((S_t - S_(t-1)) / S_(t-1))
Create an ACF plot for the log returns, run the ADF test, and analyse the results.
Data is as follows:
NSE NIFTY index, Jan 2012 - Jan 2013
NIFTY data - closing prices

Commands:-

> niftychart<-read.csv(file.choose(),header=T)   # read the NIFTY CSV
> closingval<-niftychart$Close                   # extract the closing prices
> closingval.ts<-ts(closingval,frequency=252)    # daily series, 252 trading days per year
> plot(log(closingval.ts))
> minusone.ts<-lag(closingval.ts,k=-1)           # one-day lagged series (note: lower-case k)
> plot(log(minusone.ts))
> z<-log(closingval.ts)-log(minusone.ts)         # log S_t - log S_(t-1)
> z

> returns<-z/log(minusone.ts)                    # formula (1) above
> plot(returns,main="Plot of Log Returns; CNX NSE Nifty Jan-2012 to Jan-2013")
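
As an aside, the more usual definition of a log return is simply the difference of log prices, log S_t - log S_(t-1), without the extra division; in R that is a one-liner (a sketch, not part of the assignment brief):

> logret<-diff(log(closingval.ts))   # standard log returns: log(S_t/S_(t-1))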



> acf(returns,main="The Autocorrelation Plot; dotted lines show the 95% confidence interval")


The ACF plot shows that the autocorrelations all lie within the 95% confidence bands (the dotted lines), so there is a fairly good case for treating the data as "STATIONARY".
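
For reference, the dotted bands that acf() draws sit at roughly ±1.96/√n; a minimal sketch for checking a single lag by hand (the variable names are my own):

> acfvals<-acf(returns,plot=FALSE)    # autocorrelations without drawing the plot
> n<-length(returns)                  # number of return observations
> band<-qnorm(0.975)/sqrt(n)          # the 95% band, about 1.96/sqrt(n)
> abs(acfvals$acf[2])<band            # TRUE if the lag-1 autocorrelation is inside the band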


> library(tseries)   # adf.test() is provided by the tseries package
> adf.test(returns)



Now with the ADF test: if its p-value is below 0.05 we reject the null hypothesis of a unit root, confirming that the data are "Stationary".
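
Since adf.test() returns an object with a p.value component, the same check can be done programmatically (a sketch):

> adfres<-adf.test(returns)
> adfres$p.value<0.05    # TRUE: reject the unit-root null, i.e. the series is stationary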

# Now calculating the historical volatility of the data

> ann<-252^0.5                       # sqrt(252); avoid calling this T, which masks TRUE
> histvolatility<-sd(returns)*ann    # annualize daily volatility by multiplying by sqrt(252)
> histvolatility
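
Historical volatility is often also reported over a rolling window; a minimal sketch using zoo::rollapply, with an assumed 21-day window:

> library(zoo)
> rollvol<-rollapply(returns,width=21,FUN=sd)*sqrt(252)   # 21-day rolling, annualized
> plot(rollvol,main="21-day Rolling Historical Volatility")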


Tuesday, 5 February 2013

R Statistical Tool Assignment-4

This class on 5th of Feb revolved around reading a particular data set, converting it into a time-series format, and then calculating the returns from it.
Data set: CNX Mid-cap Index, downloaded from NSE, August 2012 - January 2013; the 10th to the 95th reading is used.
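
A quick note on ts() before the commands: frequency and deltat express the same thing two ways; a toy sketch:

> ts(1:5,frequency=252)   # 252 observations per unit of time...
> ts(1:5,deltat=1/252)    # ...identical to specifying the spacing directly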


Commands:-
> z<-read.csv(file.choose(),header=T)    # read the CNX Mid-cap CSV
> Close<-z$Close                         # closing prices
> Close
> Close.ts<-ts(Close)
> Close.ts<-ts(Close,deltat=1/252)       # re-create with daily spacing (overwrites the line above)
> z1<-ts(Close.ts[10:95],deltat=1/252)   # keep readings 10 to 95; deltat=1/252 keeps the daily spacing
> z1.ts<-ts(z1)
> z1.ts
> z1.diff<-diff(z1)                      # S_t - S_(t-1)
> z2<-lag(z1.ts,k=-1)                    # S_(t-1) (note: lower-case k)
> Returns<-z1.diff/z2                    # simple returns: (S_t - S_(t-1))/S_(t-1)
> plot(Returns,main="Returns from 10th to 95th day of NSE Mid-cap Index")
> z3<-cbind(z1.ts,z1.diff,Returns)       # combine the three series for a panel plot
> plot(z3,main="Data from 10th-95th day; Difference; Returns")
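
As a sanity check on the time-series arithmetic, the same simple returns can be computed directly on the raw vector (a sketch; the indices follow the 10th-95th selection above):

> v<-Close[10:95]                    # the same 86 readings as z1
> ret_check<-diff(v)/v[-length(v)]   # (S_t - S_(t-1))/S_(t-1) on the plain vector
> head(ret_check)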





Assignment:-2

Question: Data for observations 1-700 is available. Predict defaults for observations 701-850, using GLM estimation with LOGIT analysis.
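
Background: logit analysis models the default probability as P(default = 1) = 1/(1 + e^-(b0 + b1*x1 + ... + bk*xk)), and glm() with family="binomial" fits exactly this by maximum likelihood.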

Commands:-

> z<-read.csv(file.choose(),header=T)    # read the loan data
> z1<-z[1:700,1:9]                       # estimation sample: rows 1-700, all 9 columns
> head(z1)
> z1$ed<-factor(z1$ed)                   # education level is categorical
> z1.est<-glm(default ~ age + ed + employ + address + income + debtinc + creddebt + othedebt, data=z1, family="binomial")
> summary(z1.est)
> forecast<-z[701:850,1:8]               # hold-out sample: rows 701-850, predictors only
> forecast$ed<-factor(forecast$ed)       # same factor treatment as the estimation sample
> forecast$probability<-predict(z1.est,newdata=forecast,type="response")   # predicted default probabilities
> head(forecast)
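
To turn the predicted probabilities into 0/1 default predictions, a common choice (an assumption here, not part of the assignment) is a 0.5 cut-off:

> forecast$predicted<-ifelse(forecast$probability>0.5,1,0)   # classify at the 0.5 cut-off
> table(forecast$predicted)                                  # count predicted defaulters vs non-defaulters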