

# Probability theory # Cross entropy: theoretical part of project # Define each condition, formula, and fundamental idea

Post by answerhappygod »

# Probability theory
# Cross entropy: theoretical part of project
# Define each condition, formula, and idea, and explain the fundamentals broadly but briefly, because it is a 16-mark question.
The log-likelihood function
$$L\bigl((f_1,\dots,f_k),(p_1,\dots,p_k)\bigr)=\sum_{i=1}^{k} f_i \log p_i$$
that you (hopefully) obtained in the final project is also known as cross entropy. It is extremely popular in machine learning, namely in classification problems. It lets you measure how well a probability distribution $(p_1,\dots,p_k)$ fits the actual absolute frequencies $(f_1,\dots,f_k)$ obtained from the data. Assume that the frequencies $(f_1,\dots,f_k)$ are fixed. What is the best distribution $(p_1,\dots,p_k)$ from the likelihood's perspective? Intuitively, it seems that we have to put the relative frequencies
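As a quick illustration (not part of the original problem), here is a minimal Python sketch of this log-likelihood; the example frequencies and the name `log_likelihood` are hypothetical:

```python
import numpy as np

def log_likelihood(f, p):
    """Cross-entropy log-likelihood L = sum_i f_i * log(p_i)."""
    f = np.asarray(f, dtype=float)
    p = np.asarray(p, dtype=float)
    return float(np.sum(f * np.log(p)))

# Hypothetical observed absolute frequencies (f_1, ..., f_k).
f = [30.0, 50.0, 20.0]
# Relative frequencies r_i = f_i / sum_j f_j.
r = np.asarray(f) / np.sum(f)

# The claim proved below: over all distributions p, L is maximized at p = r.
print(log_likelihood(f, r))                # ~ -102.97, likelihood at p = r
print(log_likelihood(f, [0.2, 0.5, 0.3]))  # strictly smaller for any other p
```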

$$r_i=\frac{f_i}{\sum_{j=1}^{k} f_j}$$
as $p_i$ to get the best fit. In fact, this is true. To prove it, let us use Jensen's inequality for logarithms, which is stated as follows: for any values $\alpha_1,\dots,\alpha_k$ such that $\sum_{j=1}^{k}\alpha_j=1$ and $\alpha_j>0$ for all $j=1,\dots,k$, and any positive values $x_1,\dots,x_k$, the following inequality holds:
$$\log\Bigl(\sum_{j=1}^{k}\alpha_j x_j\Bigr)\;\ge\;\sum_{j=1}^{k}\alpha_j\log(x_j).$$
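Before applying the inequality, it may help to see it hold numerically. This is an illustrative check only, with randomly chosen inputs; the names `alpha` and `x` are mine, mirroring the $\alpha_j$ and $x_j$ above:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

k = 5
alpha = rng.random(k)
alpha /= alpha.sum()          # weights: alpha_j > 0 and sum_j alpha_j = 1
x = rng.random(k) + 0.1       # arbitrary positive values x_j

lhs = np.log(np.sum(alpha * x))   # log of the weighted average
rhs = np.sum(alpha * np.log(x))   # weighted average of the logs

print(lhs >= rhs)  # True: log is concave, so Jensen gives lhs >= rhs
```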

Use this inequality to prove that
$$\sum_{j=1}^{k} r_j \log p_j \;-\; \sum_{j=1}^{k} r_j \log r_j \;\le\; 0,$$
then prove that to obtain the maximum log-likelihood (and therefore the maximum likelihood) for fixed $(f_1,\dots,f_k)$ we have to put $p_i=r_i$, $i=1,\dots,k$. Hint: use properties of the logarithm to transform the left-hand side of the last inequality into the right-hand side of the previous inequality.
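For orientation, here is a sketch of how the hint plays out (not a substitute for writing up the full proof): substituting $\alpha_j = r_j$ and $x_j = p_j/r_j$ into Jensen's inequality gives
$$\sum_{j=1}^{k} r_j \log p_j - \sum_{j=1}^{k} r_j \log r_j
  = \sum_{j=1}^{k} r_j \log\frac{p_j}{r_j}
  \;\le\; \log\Bigl(\sum_{j=1}^{k} r_j \,\frac{p_j}{r_j}\Bigr)
  = \log\Bigl(\sum_{j=1}^{k} p_j\Bigr) = \log 1 = 0,$$
with equality exactly when all the ratios $p_j/r_j$ are equal, i.e. $p_j = r_j$ for all $j$. Since $L = \bigl(\sum_j f_j\bigr)\sum_j r_j \log p_j$ and the factor $\sum_j f_j$ is fixed, this pins down $p = r$ as the maximizer of the likelihood.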