Original filename: 4I20-IJAET0520830_v7_iss2_327-333.pdf
Author: ijaet
This PDF 1.5 document was generated by Microsoft® Word 2013 (file size: 680 KB, 7 pages).

International Journal of Advances in Engineering & Technology, May, 2014.
©IJAET
ISSN: 2231-1963

SIGNIFICANT FACTORS TO AFFECT THE BLOOD PRESSURE
Sameera Abdulsalam Othman¹, Shelan Saied Ismaeel², Kurdistan M. Tahar Omer³
¹Department of Mathematics, Faculty of Educational Science, School of Basic Education, University of Duhok, Iraq.
²,³Department of Mathematics, Faculty of Science, University of Zakho, Iraq

ABSTRACT
We use stepwise regression and other methods to identify the best model when the model contains an intercept and when it does not, and we apply the leverage-point method when a new point is added to the original data. We test the significance of the intercept using the F, AIC and Cp criteria.

KEYWORDS: stepwise, hypothesis, significant, original, regression.

I. INTRODUCTION

Multiple linear regression (MLR) is a method used to model the linear relationship between a dependent variable and one or more independent variables. The dependent variable is sometimes also called the predictand, and the independent variables the predictors. The aim of this paper is to use stepwise regression and other methods to determine the best model when the model contains an intercept and when it does not. The paper also presents the important definitions of regression, the most important relationships, and the equations used to solve an example of multiple linear regression by least squares, with estimation and hypothesis tests for the parameters. The main application is to blood pressure (dependent variable Y) with height, weight, age, sugar, sex, hereditary factor and social status as independent variables.

II. LINEAR REGRESSION MODELS AND THEIR TYPES

a- Linear regression model with intercept:
The regression has an intercept if the regression line intersects the Y axis away from the origin; mathematically B0 ≠ 0, where B0 is the intersection point of the regression line with the Y axis:
Yi = B0 + B1 Xi1 + ei ,  i = 1, 2, 3, ⋯ , n   (1)
Yi = dependent variable
Xi1 = independent variable
B0, B1 = regression parameters
ei = value of the random error

b- Linear regression model without intercept [4],[5]:
The regression is without an intercept when the regression line passes through the origin; mathematically B0 = 0. We can write the simple linear regression model as
Yi = B1 Xi1 + ei   (2)
The parameters B0 and B1 are usually unknown and are estimated by the least squares method, from (3) and (4):
B̂1 = Sxy / Sxx   (3)
B̂0 = Ȳ − B̂1 X̄   (4)

Vol. 7, Issue 2, pp. 327-333

where
Sxy: the corrected sum of cross-products between X and Y
Sxx: the corrected sum of squares of X
But for linear regression without an intercept,
B̂1 = ∑_{i=1}^{n} Xi Yi / ∑_{i=1}^{n} Xi²   (5)
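As a concrete illustration of the estimators in equations (3)–(5), the following Python sketch (function names are ours, not from the paper) computes the least-squares estimates with and without an intercept:

```python
import numpy as np

def ols_with_intercept(x, y):
    """Estimate B0, B1 for Yi = B0 + B1*Xi + ei, using equations (3) and (4)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxy = np.sum((x - x.mean()) * (y - y.mean()))  # corrected cross-product Sxy
    sxx = np.sum((x - x.mean()) ** 2)              # corrected sum of squares Sxx
    b1 = sxy / sxx
    b0 = y.mean() - b1 * x.mean()
    return b0, b1

def ols_through_origin(x, y):
    """Estimate B1 for Yi = B1*Xi + ei, using equation (5)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return np.sum(x * y) / np.sum(x ** 2)

# small illustrative data, not from the paper
b0, b1 = ols_with_intercept([0.0, 1.0, 2.0], [1.0, 3.0, 5.0])
b1_origin = ols_through_origin([1.0, 2.0, 3.0], [2.1, 4.2, 5.9])
```

Note the two slope estimators generally differ unless the fitted line already passes close to the origin.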

III. LEVERAGE POINT AND REGRESSION THROUGH THE ORIGIN

This interpretation can be used to understand the difference between the full fit and the fit forced through the origin. It can be shown that regression through the origin is equivalent to fitting the full model to a new data set. This new data set is composed of the original observations plus one added point. Evaluating the leverage possessed by this new point is equivalent to evaluating whether B0 = 0 in the full model. [2]

IV. AUGMENTING THE DATA SET

Let (X1, Y1), (X2, Y2), ⋯, (Xn, Yn) be n data points observed according to
Yi = b0 + b1 Xi + ∈i
where the experimental errors ∈i are independently normally distributed with mean 0 and variance σ². The least squares estimates for b0 and b1 are
b̂1 = ∑_{i=1}^{n}(xi − x̄)(yi − ȳ) / ∑_{i=1}^{n}(xi − x̄)² ,  b̂0 = Ȳ − b̂1 X̄
where X̄ = ∑Xi/n and Ȳ = ∑Yi/n. [2],[7]

If the regression is forced through the origin, then it is assumed that the data are observed according to Yi = B1 Xi + ei, and the least squares estimate of B1 in this model is
B̂1 = ∑_{i=1}^{n} xi yi / ∑_{i=1}^{n} xi²

If the original data set is augmented with a new observation
(X_{n+1}, Y_{n+1}) = (n* X̄, n* Ȳ) ,  where n* = n / (√(n+1) − 1)   (6)
then fitting the full model to the augmented data set is equivalent to forcing the original regression through the origin. This follows from the easily verified identities
∑_{i=1}^{n+1}(Xi − X̄_{n+1})(Yi − Ȳ_{n+1}) = ∑_{i=1}^{n} Xi Yi ,
∑_{i=1}^{n+1}(Xi − X̄_{n+1})² = ∑_{i=1}^{n} Xi² ,
∑_{i=1}^{n+1}(Yi − Ȳ_{n+1})² = ∑_{i=1}^{n} Yi²
where X̄_{n+1} = ∑_{i=1}^{n+1} Xi/(n+1) and Ȳ_{n+1} = ∑_{i=1}^{n+1} Yi/(n+1).

The position of the point (n* X̄, n* Ȳ) relative to the other points determines whether the new point has high or low leverage. The leverage of the new point can be used to decide whether regression through the origin is more appropriate than an intercept term. [6]
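The augmentation in equation (6) and the identities above can be checked numerically; the sketch below uses made-up data, not the paper's data set:

```python
import numpy as np

def augmented_point(x, y):
    """Equation (6): the augmentation point (n* x̄, n* ȳ) with n* = n / (sqrt(n+1) - 1)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    n_star = n / (np.sqrt(n + 1) - 1)
    return n_star * x.mean(), n_star * y.mean()

# made-up data for the check
rng = np.random.default_rng(0)
x = rng.normal(5.0, 2.0, size=20)
y = 3.0 * x + rng.normal(0.0, 1.0, size=20)

x_new, y_new = augmented_point(x, y)
xa, ya = np.append(x, x_new), np.append(y, y_new)

# the identities: centred sums over the augmented set equal raw sums over the original
assert np.isclose(np.sum((xa - xa.mean()) * (ya - ya.mean())), np.sum(x * y))
assert np.isclose(np.sum((xa - xa.mean()) ** 2), np.sum(x ** 2))

# hence the full-model slope on the augmented data equals the through-origin slope
slope_aug = np.sum((xa - xa.mean()) * (ya - ya.mean())) / np.sum((xa - xa.mean()) ** 2)
slope_origin = np.sum(x * y) / np.sum(x ** 2)
assert np.isclose(slope_aug, slope_origin)
```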

V. THE LEVERAGE

There is also a straightforward generalization to multiple linear regression: the original data are augmented with the point (n* X̄, n* Ȳ), where X̄ is a vector containing the means of the independent variables. The impact of the augmented data point on the fit becomes clearer when h_{n+1} is examined; it is, perhaps, more instructive to write h_{n+1} in the equivalent form
h_{n+1} = (1/(n+1)) [1 + n (X̄² / (σx² + X̄²))]   (7)
where
σx² = ∑(Xi − X̄)² / n   (8)
Thus the impact of the augmented data point increases with (X̄/σx)², and we can expect the greatest discrepancy between the full fit and the fit through the origin when X̄ is large compared to σx. The augmented data set will seem to be composed of two distinct clusters: one composed of the original data and one composed of (n* X̄, n* Ȳ). [3],[9]
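Equation (7) can be sketched directly (a hypothetical helper name, using equation (8)'s definition of σx²):

```python
import numpy as np

def leverage_of_augmented_point(x):
    """h_{n+1} from equation (7): (1/(n+1)) * (1 + n * x̄² / (σx² + x̄²)),
    with σx² = Σ(xi − x̄)²/n as in equation (8)."""
    x = np.asarray(x, float)
    n = len(x)
    xbar = x.mean()
    sigma2 = np.sum((x - xbar) ** 2) / n
    return (1.0 / (n + 1)) * (1 + n * xbar ** 2 / (sigma2 + xbar ** 2))

# two limiting cases: data centred at 0 gives the minimum leverage 1/(n+1);
# constant data (σx² = 0) gives the maximum leverage 1
h_low = leverage_of_augmented_point([-1.0, 0.0, 1.0])
h_high = leverage_of_augmented_point([2.0, 2.0, 2.0])
```

The two limiting cases mirror the discussion above: leverage grows as X̄ becomes large relative to σx.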

VI. STEPWISE REGRESSION AND OTHER METHODS

Stepwise regression is an appropriate analysis when you have many variables and you are interested in identifying a useful subset of the predictors. In Minitab, the standard stepwise regression procedure both adds and removes predictors one at a time. Minitab stops when all variables not included in the model have p-values greater than a specified Alpha-to-Enter value and all variables in the model have p-values less than or equal to a specified Alpha-to-Remove value. In addition to the standard stepwise method, Minitab offers two other types of stepwise procedures. [1],[10]

i- Forward selection
a- Start with the most significant variable
b- Given that the first variable is in the model, enter the variable that gives the largest contribution to the model
c- Continue and add all significant variables

ii- Backward elimination
a- Start with a model including all variables
b- Remove the most insignificant variable
c- Continue until all remaining variables in the model are significant
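The forward-selection steps above can be sketched as follows. This is a simplified, hypothetical implementation (not Minitab's actual procedure) that adds one predictor at a time using partial F-test p-values via SciPy:

```python
import numpy as np
from scipy import stats

def forward_selection(X, y, alpha_enter=0.05):
    """Forward selection: repeatedly add the predictor with the smallest
    partial-F p-value, stopping when no candidate is below alpha_enter."""
    n, p = X.shape
    selected = []

    def sse(cols):
        # residual sum of squares for an intercept model with the given columns
        A = np.column_stack([np.ones(n)] + [X[:, j] for j in cols])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        return resid @ resid

    current_sse = sse([])
    while True:
        best = None
        for j in range(p):
            if j in selected:
                continue
            new_sse = sse(selected + [j])
            df_resid = n - len(selected) - 2  # intercept + current terms + candidate
            f_stat = (current_sse - new_sse) / (new_sse / df_resid)
            p_val = stats.f.sf(f_stat, 1, df_resid)
            if best is None or p_val < best[1]:
                best = (j, p_val, new_sse)
        if best is None or best[1] > alpha_enter:
            break
        selected.append(best[0])
        current_sse = best[2]
    return selected

# toy demo: only column 0 truly drives y, so it should be selected first
rng = np.random.default_rng(1)
X = rng.normal(size=(80, 3))
y = 3.0 * X[:, 0] + rng.normal(scale=0.5, size=80)
chosen = forward_selection(X, y)
```

Backward elimination works symmetrically: start with all predictors and repeatedly drop the one with the largest p-value until every remaining p-value is below the removal threshold.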

VII. ANOVA TABLE

For a one-way ANOVA, the table looks like: [8],[5]

Source     df    SS    MS   F        p-value
Treatment  k−1   SSTr  MST  MST/MSE  p
Error      N−k   SSE   MSE
Total      N−1   SST

VIII. TESTING OF HYPOTHESES

These three criteria (the p-value, AIC and Cp) are somewhat related to one another. The Akaike information criterion is
AIC = n ln(σ̂²) + 2(k + 1) ,  where σ̂² = SSE/n   (9)
C.L. Mallows suggested the statistic
Cp = RSSp/σ̂² − (n − 2p)   (10)
[11]
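Equations (9) and (10) translate directly into code. These are hypothetical helper names (not from the paper); k is the number of predictors in the model and p the number of parameters in the submodel:

```python
import numpy as np

def aic(sse, n, k):
    """Equation (9): AIC = n*ln(σ̂²) + 2(k+1), with σ̂² = SSE/n."""
    sigma2_hat = sse / n
    return n * np.log(sigma2_hat) + 2 * (k + 1)

def mallows_cp(rss_p, sigma2_hat_full, n, p):
    """Equation (10): Cp = RSS_p / σ̂² − (n − 2p), with σ̂² from the full model."""
    return rss_p / sigma2_hat_full - (n - 2 * p)

# illustrative call with made-up values
example_aic = aic(sse=120.0, n=100, k=3)
example_cp = mallows_cp(rss_p=100.0, sigma2_hat_full=2.0, n=50, p=3)
```

Smaller values of both criteria indicate a better trade-off between fit and model size, which is how they are used in the application below.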

IX. APPLICATION

This section contains an application of the theoretical part presented in the preceding sections. Data for the practical application were obtained from health centres. We study blood pressure (dependent variable Y) against height (X1), weight (X2), age (X3), sugar (X4), sex (X5: 0 denotes male, 1 female), hereditary factor (X6: 0 if there is a hereditary factor, 1 if not) and social status (X7: 0 if married, 1 if single) as independent variables for 100 persons, as shown in Table (1).

Table (1): blood pressure and height, weight, age, sugar, sex, hereditary factor for 100 persons

No.  Y    X1   X2   X3   X4   X5  X6  X7
1    130  145  41   24   178  1   0   0
2    140  152  79   49   100  1   1   0
3    140  158  83   34   270  0   1   1
4    130  172  83   48   223  1   1   0
5    150  161  67   40   82   1   1   0
6    190  160  72   46   160  0   0   1
7    110  159  48   21   170  0   0   0
8    80   152  75   44   82   1   1   1
9    120  165  54   46   147  0   1   1
10   140  157  69   31   150  0   0   0
11   140  161  43   45   117  0   0   0
12   130  165  83   48   165  0   0   0
13   170  160  110  41   335  0   1   1
14   80   154  45   27   90   0   1   1
15   240  151  54   50   314  0   0   1
16   100  180  69   32   151  0   1   1
17   180  167  91   40   190  0   1   1
18   160  154  70   100  264  1   1   1
19   120  172  59   55   133  1   1   0
20   100  152  53   62   90   0   1   1
21   140  162  43   80   100  0   0   1
22   230  174  72   60   82   0   1   0
23   70   163  48   25   270  0   1   1
24   190  155  92   70   350  0   1   1
25   210  151  80   50   250  0   1   1
26   200  156  87   45   234  0   1   1
27   240  151  54   50   310  0   1   0
28   250  172  90   65   126  1   1   1
29   117  154  55   28   260  0   0   1
30   230  174  72   60   245  0   1   1
31   145  155  69   33   130  0   1   1
32   145  150  65   35   144  0   1   1
33   190  156  93   45   249  0   1   1
34   190  160  72   46   320  0   1   1
35   160  160  64   50   227  0   0   1
36   170  158  55   55   179  0   1   1
37   160  157  70   54   234  0   0   1
38   175  160  83   44   230  0   0   0
39   150  168  111  50   300  0   0   1
40   180  176  81   60   420  0   0   0
41   180  159  67   50   255  0   1   1
42   170  156  70   47   289  0   1   1
43   150  149  60   45   176  0   1   1
44   160  154  70   60   156  0   0   1
45   150  154  69   40   195  0   1   1
46   118  168  76   60   188  0   1   0
47   80   152  75   44   173  0   1   1
48   170  160  64   45   143  0   1   1
49   180  160  60   32   128  0   1   1
50   150  162  68   40   342  0   1   1
51   80   153  45   27   370  1   1   1
52   70   163  48   25   339  1   1   1
53   210  171  116  40   310  0   1   1
54   240  151  54   50   255  0   1   1
55   140  157  69   31   295  0   1   1
56   140  161  43   45   244  0   0   1
57   170  164  60   54   280  0   0   0
58   180  152  61   45   296  0   1   1
59   80   168  67   37   320  0   1   1
60   210  153  35   58   195  0   0   1
61   200  156  87   45   190  0   1   1
62   130  172  83   48   210  0   0   0
63   200  166  90   45   216  0   1   1
64   170  169  85   27   269  0   1   1
65   80   150  45   28   270  1   1   1
66   180  166  69   41   378  0   1   0
67   80   150  55   25   399  1   1   1
68   150  161  54   40   369  0   1   1
69   170  173  79   59   315  0   1   0
70   220  171  82   60   345  0   1   0
71   170  156  53   16   214  0   1   1
72   170  167  70   55   195  0   0   0
73   230  174  72   60   377  0   1   0
74   180  152  45   60   430  0   0   1
75   200  171  60   60   377  0   1   0
76   180  164  60   43   384  0   0   0
77   180  157  80   50   293  0   1   1
78   125  172  65   42   259  0   1   0
79   170  171  80   59   360  0   1   0
80   180  159  70   48   415  0   1   1
81   180  169  60   54   265  0   1   0
82   190  155  80   70   159  0   1   1
83   180  155  50   50   188  0   1   1
84   200  160  64   50   173  0   1   0
85   90   169  54   38   231  0   0   0
86   220  150  50   65   276  1   1   0
87   220  148  67   55   296  0   1   1
88   180  157  55   60   169  0   1   1
89   170  158  65   60   258  0   1   1
90   200  160  65   52   432  0   0   1
91   160  172  70   70   170  0   1   0
92   180  154  65   52   185  0   1   1
93   250  172  90   65   265  0   1   0
94   210  151  80   50   250  0   1   1
95   117  154  55   28   128  0   1   1
96   150  172  96   35   235  0   1   0
97   160  160  70   69   170  0   1   0
98   170  160  110  41   174  0   1   1
99   90   154  50   25   175  1   1   1
100  180  152  85   60   280  0   0   1

Using Minitab (13.2) software, we compare the linear regression model with intercept and without intercept by using stepwise regression and other methods.


X. THE STATISTICAL ANALYSIS

We obtain the linear regression model by analysing the data. The regression equation is
y = 214 - 1.08 X1 + 0.631 X2 + 1.39 X3 + 0.0839 X4 - 34.1 X5 + 4.93 X6 - 11.2 X7
Here is the printout from the model of Y versus X1–X7. The overall F test is highly significant, but three of the individual p-values are high.
Table (2): Analysis of Variance by using one-way ANOVA table

Source          DF   SS      MS     Fc    Ft    P      F significant
Regression       7   76753   10965  8.91  3.27  0.000  2.46E-08
Residual Error  92   113278   1231
Total           99   190031

Predictor   Coef      SE Coef   T      P
Constant    214.36    90.18     2.38   0.020
height      -1.0754   0.5698    -1.89  0.062
weight      0.6310    0.2398    2.63   0.010
age         1.3869    0.2647    5.24   0.000
suger       0.08386   0.04030   2.08   0.040
sex         -34.14    10.71     -3.19  0.002
heredita    4.926     8.685     0.57   0.572
social s    -11.233   9.087     -1.24  0.220

Stepwise Regression: y versus height, weight, age, sugar, sex, hereditary factor

Alpha-to-Enter: 0.05   Alpha-to-Remove: 0.05
Response is y on 7 predictors, with N = 100

Step         1      2      3
Constant     88.52  97.05  64.53

age          1.55   1.48   1.38
T-Value      5.56   5.54   5.24
P-Value      0.000  0.000  0.000

sex                 -35    -31
T-Value             -3.34  -2.98
P-Value             0.001  0.004

weight                     0.53
T-Value                    2.37
P-Value                    0.020

S            38.4   36.5   35.7
R-Sq         24.00  31.86  35.62
R-Sq(adj)    23.22  30.45  33.61
C-p          21.3   11.2   7.4

Now we have the AIC values for each model:

Model                                            AIC
Blood Pressure and age, weight & sex             718.9397
Blood Pressure and age, height & sex             724.3908
Blood Pressure and age, sex & sugar              720.4059
Blood Pressure and age, sex & hereditary factor  724.2142
Blood Pressure and age, sex & social status      724.5548

The final step is to determine which AIC value is smallest. In this case the model with age, weight and sex has the smallest AIC value (718.9397); therefore we determine that this model is the best fit given our data:
y = 64.5 + 0.532 X2 + 1.38 X3 - 31.2 X5
When the point (n* X̄, n* Ȳ) is added to the original data, we can see whether the new point affects the importance of the intercept in the original data:
n* = 11.049, n* Ȳ = 47.7871, n* X̄1 = 1772.149, n* X̄2 = 756.414, n* X̄3 = 521.291, n* X̄4 = 2625.684, n* X̄5 = 1.546, n* X̄6 = 8.287, n* X̄7 = 7.403
We calculate the leverage of this point by equation (7):
h_{n+1} = h_101 = 0.836
When the new point is added to the original data, the data become n = 101. We obtain the linear regression model by analysing the data; the regression equation is
y = 0.36 + 0.238 X1 + 0.526 X2 + 1.45 X3 + 0.0850 X4 - 27.7 X5 + 2.40 X6 + 1.10 X7
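The reported value n* = 11.049 can be checked directly from equation (6) with n = 100:

```python
import math

n = 100
n_star = n / (math.sqrt(n + 1) - 1)  # equation (6): n* = n / (sqrt(n+1) - 1)
# n_star ≈ 11.0499, which the paper reports (truncated) as 11.049
```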
Table (3): Analysis of Variance

Source          DF    SS       MS      F       P
Regression       7    2687881  383983  297.01  0.000
Residual Error  93    120235   1293
Total           100   2808116

Predictor   Coef      SE Coef   T      P
Constant    214.36    90.18     2.38   0.020
height      -1.0754   0.5698    -1.89  0.062
weight      0.6310    0.2398    2.63   0.010
age         1.3869    0.2647    5.24   0.000
suger       0.08386   0.04030   2.08   0.040
sex         -34.14    10.71     -3.19  0.002
heredita    4.926     8.685     0.57   0.572
social s    -11.233   9.087     -1.24  0.220

Stepwise Regression: y versus height, weight, age, sugar, sex, hereditary factor

Alpha-to-Enter: 0.05   Alpha-to-Remove: 0.05
Response is y on 7 predictors, with N = 101

Step         1       2       3       4       5       6
Constant     0.2421  0.2203  0.3340  0.3941  0.3739  1.5727

height       1.008   0.550   0.592   0.368   0.251
T-Value      36.84   6.32    7.00    2.93    1.85
P-Value      0.000   0.000   0.000   0.004   0.067

age                  1.56    1.51    1.44    1.44    1.68
T-Value              5.48    5.53    5.36    5.44    7.20
P-Value              0.000   0.000   0.000   0.000   0.000

sex                          -33     -29     -27     -23
T-Value                      -3.06   -2.75   -2.64   -2.25
P-Value                      0.003   0.007   0.010   0.027

weight                               0.56    0.54    0.83
T-Value                              2.36    2.32    4.71
P-Value                              0.020   0.023   0.000

suger                                        0.085   0.116
T-Value                                      2.08    3.10
P-Value                                      0.040   0.003

S            43.9    38.6    37.1    36.2    35.6    36.0
R-Sq         93.20   94.80   95.26   95.52   95.71   95.56
R-Sq(adj)    93.13   94.69   95.11   95.33   95.49   95.37
C-p          50.7    18.0    10.0    6.4     4.1     5.5

Now we have the AIC values for each model:

Model                                                   AIC
Blood Pressure and age, weight, sex, height             729.9308
Blood Pressure and age, weight, sex, sugar              729.0023
Blood Pressure and age, weight, sex, hereditary factor  737.8515
Blood Pressure and age, weight, sex, social status      737.5767
The final step is to determine which AIC value is smallest. In this case the model with age, weight, sex and sugar has the smallest AIC value (729.0023); therefore we determine that this model is the best fit given our data. The regression equation is
y = 1.57 + 0.832 X2 + 1.68 X3 + 0.116 X4 - 23.0 X5
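As a quick illustration of the final equation (using a hypothetical person's values, not a case from Table (1)):

```python
# weight X2 = 70, age X3 = 50, sugar X4 = 200, male (X5 = 0)
x2, x3, x4, x5 = 70.0, 50.0, 200.0, 0.0
y_hat = 1.57 + 0.832 * x2 + 1.68 * x3 + 0.116 * x4 - 23.0 * x5
# predicted blood pressure: 1.57 + 58.24 + 84.0 + 23.2 = 167.01
```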

XI. CONCLUSION

Comparing the models with and without intercept, we find that the model obtained after adding the new leverage point is better (in terms of p-values and AIC) than the model without intercept, because when the leverage point is added (B0 = 0) another factor, sugar, is found to affect blood pressure, which is medically realistic.

REFERENCES
[1] Bo Wang (2011), "Variable Ranking by Solution-path Algorithms", Canada.
[2] Betty R. Kirkwood (1988), Medical Statistics, www.blackwell-science.com.
[3] Casella, G. (1983), "Leverage and Regression Through the Origin", The American Statistician, Vol. 37, No. 2, pp. 147-152.
[4] Cohen, J., Cohen, P., West, S. G., & Aiken, L. S. (2003), Applied Multiple Regression/Correlation Analysis for the Behavioural Sciences, 3rd Edition, Mahwah, NJ: Lawrence Erlbaum Associates.
[5] D. R. Helsel and R. M. Hirsch (1991), Techniques of Water-Resources Investigations of the United States Geological Survey, Book 4: Hydrologic Analysis and Interpretation, Reston, VA, USA.
[6] David L. Farnsworth (1990), "The Effect of a Single Point on Correlation and Slope", Internat. J. Math. & Math. Sci., Vol. 13, No. 4, pp. 799-806.
[7] Gujarati (2004), Basic Econometrics, 4th Edition, The McGraw-Hill Companies.
[8] Helsel, D. R. and R. M. Hirsch (2002), Statistical Methods in Water Resources: Techniques of Water-Resources Investigations, Book 4, Chapter A3, U.S. Geological Survey, 168 pages.
[9] John O. Rawlings, Maneesha & Priti Bajpai (2013), "Multiple Regression Analysis Using ANCOVA in University Model", International Journal of Applied Physics and Mathematics, Vol. 3, No. 5.
[10] R. A. Johnson (2003), Probability and Statistics for Engineers, 6th Edition, Pearson Education, Ch. 12, pp. 428-432.
[11] Statistics Group of the I.O.M.S. Department (2004), "Multiple Linear Regression in Minitab".

AUTHORS BIOGRAPHY
Sameera Abdulsalam Othman is associated with the Department of Mathematics, School of Basic Education, University of Duhok, Iraq. She has completed B.Sc. in Mathematical Science & M.Sc. in Statistics.

Shelan Saied Ismaeel is working in the Department of Mathematics, Faculty of Science, University of Zakho, Iraq. She has completed B.Sc. in Mathematical Science & M.Sc. in Statistics.

Kurdistan M. Tahar Omer is working in the Department of Mathematics, Faculty of Science, University of Zakho, Iraq. She has completed her B.Sc. in Mathematical Science & M.Sc. in Statistics.
