UT Dallas CS 6375 - VC-Example2-solutions

Question 1

Consider the case where the examples to a learning task are given as pairs of (not necessarily Boolean) numbers (x1, x2), labeled as positive or negative.

Part A

It is known that all positive examples, and none of the negative examples, satisfy:

    x1 = a, AND 0 ≤ x2 ≤ b

Here are a few training examples for the case in which a = 3, b = 1:

    x1    x2     label
    3     1      positive
    3     0.5    positive
    3     0      positive
    0     0.5    negative
    3     2      negative
    5     6      negative

A1

Select the most appropriate learning algorithm for this task among the following choices:

1. ID3.
2. A perceptron implemented with a sigmoid unit, with the following functions as input: φ1 = 1, φ2 = x1, φ3 = x2.
3. A neural network with the following functions as input: φ1 = 1, φ2 = x1, φ3 = x2, with one hidden layer and with as many hidden-layer nodes as needed.
4. A neural network with the following functions as input: φ1 = 1, φ2 = x1, φ3 = x2, with two hidden layers and with as many hidden-layer nodes as needed.
5. Naive Bayesian.
6. Nearest Neighbor.

Answer: 1 / 2 / 3 / 4 / 5 / 6

The most appropriate answer is 3. The positive region x1 = a, 0 ≤ x2 ≤ b is an intersection of half-planes in (x1, x2), which a single sigmoid unit cannot represent but a network with one hidden layer can; a second hidden layer adds nothing that is needed here.

A2

Your answers to this part should not depend on your answer to A1. Assume that a learning algorithm capable of producing a hypothesis consistent with all training examples is available. In each of the following cases, compute how many randomly chosen training examples are needed to guarantee, with confidence of at least 90%, that at least 95% of randomly selected test examples are answered correctly. Specify the formula you use for the computation and the value of each variable in the formula.

1. The value of a is one of the following: 1, 1.5, 2, 2.5, 3, 3.5. The value of b is one of the following: 1, 1.5, 2, 2.5, 3, 3.5.

Answer: The number of training examples should be at least 118.

The formula used:

    m ≥ (1/ε) ln(r/δ)

The variables in the formula have the values: r = |H| = 6 × 6 = 36, ε = 0.05, δ = 0.1.

2. a, b are real numbers.

Answer: The number of training examples should be at least 1630.

The formula used:

    m ≥ (1/ε) (4 log2(2/δ) + 8 d log2(13/ε))

The variables in the formula have the values: d = 1 (the VC dimension of the hypothesis class), ε = 0.05, δ = 0.1.
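
As an illustration of Part A (not part of the original solutions), the short Python sketch below checks that the target concept with a = 3 and b = 1 labels every row of the training table above correctly; the function name `concept` and its hard-coded defaults are assumptions made for this example.

```python
# Hypothetical sketch (not from the original solutions): the target concept
# x1 = a AND 0 <= x2 <= b, checked against the training table for a = 3, b = 1.

def concept(x1, x2, a=3.0, b=1.0):
    """True iff the example satisfies x1 = a and 0 <= x2 <= b."""
    return x1 == a and 0.0 <= x2 <= b

# (x1, x2, label) rows copied from the table above.
training = [
    (3, 1,   "positive"),
    (3, 0.5, "positive"),
    (3, 0,   "positive"),
    (0, 0.5, "negative"),
    (3, 2,   "negative"),
    (5, 6,   "negative"),
]

for x1, x2, label in training:
    predicted = "positive" if concept(x1, x2) else "negative"
    assert predicted == label, (x1, x2, label)

print("x1 = 3 AND 0 <= x2 <= 1 is consistent with all six training examples.")
```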
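
As a sanity check on the arithmetic in A2 (again not part of the original solutions), the sketch below evaluates both sample-complexity bounds with ε = 0.05 and δ = 0.1; the variable names are assumptions, and the printed values round up to the 118 and 1630 quoted above.

```python
# Hypothetical sketch (not from the original solutions): evaluating the two
# sample-complexity bounds quoted in A2. epsilon and delta come from the
# problem statement (at least 95% accuracy, at least 90% confidence).
import math

epsilon = 0.05   # allowed error rate on test examples
delta = 0.1      # allowed probability of failure

# Case 1: finite hypothesis space, r = |H| = 6 * 6 = 36 (six choices each for a and b).
r = 36
m_finite = (1 / epsilon) * math.log(r / delta)
print(math.ceil(m_finite))   # 118

# Case 2: a, b real-valued; VC-dimension bound with d = 1.
d = 1
m_vc = (1 / epsilon) * (4 * math.log2(2 / delta) + 8 * d * math.log2(13 / epsilon))
print(math.ceil(m_vc))       # 1630
```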

