I'm working on some predictive modelling projects and I need to iteratively compute R² statistics over 100's of variables. Each time I do the calculations I need to go and have an extended coffee break – and I'm starting to buzz with too much caffeine – so I thought I would look to see whether I could make my code more efficient!

Linear regression in matrix form looks like this:

β = (X`X)⁻¹X`Y

where the grave accent indicates the transpose of the X matrix. One of the great things about JSL is that I can directly implement this formula:

```jsl
β = Inv( X` * X ) * X` * Y;
```

That's it! One line of code to compute the parameter estimates (β) for a set of X and Y data. There's a direct correspondence between the mathematical form and the code – no need to figure out complex algorithms to convert the problem into JSL.

I of course need the matrices, so here is the full code:

```jsl
// generate matrices (the column names :x and :y are illustrative)
dt = Current Data Table();
Y = dt:y << Get As Matrix;
X = dt:x << Get As Matrix;
// add a column of 1's for the intercept term
X = J( NRows( X ), 1 ) || X;
β = Inv( X` * X ) * X` * Y;
```

Now I have my solution I can use it to compute the R² statistic:

```jsl
N = NRows( Y );
Ybar = Sum( Y ) / N;
R2 = ( β` * X` * Y - N * Ybar ^ 2 ) / ( Y` * Y - N * Ybar ^ 2 );
```

In practice I want to perform this for 100's of variables based on real-world data. That requires a bit more care to handle situations such as missing data or singular values.

Of course it's possible to perform regression in JMP using the Bivariate platform, and in JSL this is how I would extract the R² value:

```jsl
biv = Bivariate( Y( :y ), X( :x ), Fit Line );
// the exact report navigation may vary between JMP versions
r2 = Report( biv )["Summary of Fit"][Number Col Box( 1 )] << Get( 1 );
```

In fact, if my only goal is the calculation of R² then I could use the Multivariate platform. And then of course there is the Fit Model platform.

How do these methods compare in terms of performance? Below is a chart of execution times for each method:

[Chart: execution times for the matrix, Bivariate, Multivariate and Fit Model methods]

The matrix calculations are 5 times faster than Bivariate and over 30 times faster than Fit Model. That last statistic is important because I also want to generalise the method for some forward selection calculations that involve more than one X variable in the model.
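The matrix formulas above are language-agnostic, so they can be sanity-checked outside JMP. Here is a sketch of the same calculation in NumPy – the data, seed, and variable names are mine for illustration, not from the original post – confirming that the matrix-form R² matches the squared Pearson correlation in the single-X case:

```python
import numpy as np

# Illustrative data (not from the post): y depends linearly on x plus noise
rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 2.0 + 3.0 * x + rng.normal(size=200)

# Design matrix with a column of 1's for the intercept term
X = np.column_stack([np.ones_like(x), x])
Y = y.reshape(-1, 1)

# beta = Inv(X`*X) * X` * Y  -- the normal-equations solution
beta = np.linalg.inv(X.T @ X) @ X.T @ Y

# R2 = (beta`*X`*Y - N*Ybar^2) / (Y`*Y - N*Ybar^2)
N = len(Y)
ybar = Y.mean()
r2 = ((beta.T @ X.T @ Y - N * ybar**2) / (Y.T @ Y - N * ybar**2)).item()

# With a single X variable, R2 equals the squared Pearson correlation
r2_check = float(np.corrcoef(x, y)[0, 1] ** 2)
assert abs(r2 - r2_check) < 1e-10
```

In real code, `np.linalg.lstsq` or a pseudo-inverse is a safer route when X`X may be singular – the same caveat the post raises about singular values.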