Charlie Media @CharlieMediax ·
1️⃣ Shannon entropy measures the average information (uncertainty) in a random variable:

H(X) = −∑ p(x) log₂ p(x)

More unpredictability → more bits needed to describe the data. This formula is the backbone of information theory. #Math #AI
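The formula in the tweet above is easy to check numerically. A minimal sketch (the function name `shannon_entropy` is mine, not from the tweet):

```python
# Shannon entropy H(X) = -sum p(x) * log2 p(x), in bits.
from math import log2

def shannon_entropy(probs):
    """Entropy in bits of a discrete distribution.

    Zero-probability outcomes contribute nothing (p * log2 p -> 0 as p -> 0).
    """
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: exactly 1 bit per flip.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so it carries less information per flip.
print(shannon_entropy([0.9, 0.1]))   # ≈ 0.469
```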
Patricia Vargas @PatriciaIVargas ·
Replying to @H0H0v
@H0H0v H = (4/3)π

Per special right-triangle ratios…
30-60-90 : 1 : √3 : 2
30-60-90 : short leg : long leg : hypotenuse
Here the long leg is 2, so:
2 = (short leg)(√3)
short leg = 2/√3
radius = 2/√3

Area = π(radius²)
Area = π(2/√3)²
Area = (4/3)π units² #math fun
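The original figure isn't shown in this thread, so this only verifies the arithmetic in the reply above: with a long leg of 2 in a 1 : √3 : 2 right triangle, the short leg (taken as the radius) gives an area of 4π/3.

```python
# Numeric check of the 30-60-90 steps in the reply above.
from math import sqrt, pi, isclose

long_leg = 2
short_leg = long_leg / sqrt(3)   # from the 1 : sqrt(3) : 2 ratio
radius = short_leg               # the reply identifies the short leg with the radius
area = pi * radius**2

print(area)                       # ≈ 4.18879
print(isclose(area, 4 * pi / 3))  # True
```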
Patricia Vargas @PatriciaIVargas ·
Replying to @H0H0v
@H0H0v H = D.N.E. Like others, this visual math prompt pleasantly stumped me, since neither simplification nor substitution (e.g., 1, 0, the infinities) appears to yield an existing solution. For example: H²/H = H, but H = −H is illogical, and H² = −(H²) is also illogical. #math
Patricia Vargas @PatriciaIVargas ·
Replying to @H0H0v
@H0H0v Area = 8 cm²

Applying the formula for the area of a triangle, the concept of bisection, and substitution…
(1/2)(b)(h) = 32 cm²
(b)(h) = 64 cm²
(1/2)(b/2)(h/2) = Area
(1/8)(b)(h) = Area
(1/8)(64 cm²) = Area
Area = 8 cm² #math fun :-)
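The bisection argument above is easy to sanity-check: any base and height with (1/2)·b·h = 32 cm² give the same answer, since the derivation only uses the product b·h.

```python
# Check the derivation above with one concrete choice of b and h.
b, h = 16.0, 4.0                 # any pair with (1/2)*b*h == 32 works
assert 0.5 * b * h == 32.0       # original triangle: 32 cm²

# Triangle on the bisected base and bisected height:
small = 0.5 * (b / 2) * (h / 2)  # = (1/8) * b * h
print(small)                     # 8.0
```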
Charles | Maths Educator @NnachiOluwatobi ·
Want to do well in Mathematics? ➗📐 Remember MATHS:
M – Master the basics
A – Always practice
T – Think before solving
H – Have patience
S – Solve past questions
Mathematics is not about being a genius — it’s about practice, patience, and persistence. Nnachi Charles ✍️ #mathX
Felix Marin @fpmarin ·
Replying to @MasterNotesX
@MasterNotesX #math Bernoulli: v_arr² + 2gy = v_ab²; Continuity: v_arr·A_arr = v_ab·A_ab. With these, compute the time it takes to drain, using dy/dt = v_arr and y(0) = h. Atmospheric pressure has a negligible effect at these height differences: ignore it.
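A sketch of the drain-time calculation the reply above hints at. Eliminating v_ab between the Bernoulli and continuity relations gives a closed form for the emptying time; the areas A_arr, A_ab and the height h below are assumed example values, not from the thread.

```python
# Surface speed v_arr and outlet speed v_ab are linked by
# Bernoulli (v_arr**2 + 2*g*y == v_ab**2) and continuity (v_arr*A_arr == v_ab*A_ab).
from math import sqrt

g = 9.81        # m/s^2
A_arr = 1.0     # tank cross-section, m^2 (assumed)
A_ab = 0.01     # outlet cross-section, m^2 (assumed)
h = 2.0         # initial water height, m (assumed)

r = A_ab / A_arr
# Eliminating v_ab: |dy/dt| = v_arr = r * sqrt(2*g*y / (1 - r**2)),
# a separable ODE whose solution empties the tank at:
#   t_empty = (2/r) * sqrt(h * (1 - r**2) / (2*g))
t_empty = (2 / r) * sqrt(h * (1 - r**2) / (2 * g))
print(t_empty)  # ≈ 63.9 s for these assumed values
```

For r ≪ 1 this reduces to the familiar Torricelli estimate t ≈ (A_arr/A_ab)·√(2h/g).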
Charlie Media @CharlieMediax ·
1/🧵 One equation quietly powers the digital world:

H(X) = −∑ p(x) log₂ p(x)

Shannon’s Entropy. It measures uncertainty — in bits — and sets the fundamental limit of data compression. #Math #AI 🧵👇