Computational likelihood physics

Introduction

Computational likelihood physics: there is a lot of power in the likelihood function applied to physics through the complex likelihood derived from the complex logarithm. For theories that are equally, proportionally, discretely, or continuously likely, a likelihood ratio can be created and optimized in order to reveal their computationally unified dynamics. One compares δ=ln(L_s/L_a), which plays the role of a change in entropy, and then systematically builds £=∏w from the combined theories to produce the unified theory for that system. This means there are unified theories for the universe and for the multi-universe (m-theory) that can be combined to create a joint mathematical foundation for physics. I say this because to deduce whether our universe reads as an inertial or a non-inertial reference frame, you first need to check your theories: create the likelihood ratio of both and compare.

Remember that for all physicists there is a balance between experiment, theory, mathematical foundations, and philosophy. The expanding universe was "confirmed" for the mathematical foundations of entropy because one can construct a discrete universe of equal probability for its components (planets and so on), and redshift "confirmed" the mathematical foundations of the theory of motion about a point through the electromagnetic shift in the visible spectrum, which had a firm mathematical basis. However, to say that this is the Universe is a jump in logic...

So to "confirm" the validity of an aspect of the universe in regard to the next a little more is nessesary because there isn't a unified mathematical foundation for all of physics pertaining to the "universe" or "multi-universe" without a computational likelihood of either of the two for that matter it can't be said correctly. So this is the technique of computational likelihood physics CLP to combine aspects of theories and or mathematics to produce combined or composite unified explanations of nature such that a universe that includes, entropy, redshift, discrete or continuous, quantum, relativistic, and unified likelihoods. So in other words computational likelihood is a method of grand unification pertaining to scale, size, of space-time encompassed by the inertial or non inertial universe which is housed in a non inertial or inertial m-theory, or particular kind.

Grand Unified Theory of Computational Likelihood Physics

Classical mechanics in terms of likelihoods:

For a system of particles observable in travel, z_i=r_i e^(iθ), where Log z := ln r + iθ, and w_i=z_i/(∑_i z_i ) is the individual likelihood for each random occurrence of z_i. The likelihood of finding w_i among all of z is £=∏_i w_i, and the natural log-likelihood is ln£=∑_i ln w_i; the maximum of the log-likelihood occurs simultaneously with that of the likelihood. The derivative is ∂ln£/∂w=∑_i 1/w_i =∑_i(∑_i z_i )/z_i =0, which tells us that for the ith particle there is a specific distance and angle at which the particle is likely to be observed, with a particular symmetry. Thus symmetry is likely... or is asymmetry likely? In that case ∂ln£/∂w≠0: the log-likelihood approaches a maximum but the derivative is nonzero.

∂ln£/∂w=∑_i 1/w_i =Φ. We can check the truth of this by taking the likelihood ratio of the symmetric likelihood and the asymmetric likelihood. First, define the likelihood of path η such that η=l_s/l_a, where l_j=ln£_j, which gives η=log_(£_a ) £_s. The function η is a statement of symmetry on the basis of asymmetry. We can also compare δ=ln(L_s/L_a ), which is the true likelihood ratio, and which leads to an interesting fact for this system of particles: for systems where the asymmetry and symmetry are equally likely, δ=0 because ln 1 = 0. It is expected that η contains the balance between the two possibilities, running between 0 and ∞. These are the basics of the theory of computational likelihood; it is, of course, very similar to entropy, information theory, statistical mechanics, the grand canonical ensemble, and likelihood statistics.
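A minimal numerical sketch of this construction, assuming a small hypothetical set of observed complex positions z_i (the data below are made up; only the definitions above are taken from the text): the weights w_i, the complex log-likelihood ln£, and the two comparison quantities δ and η are evaluated exactly as defined.

```python
import numpy as np

# Hypothetical complex observations z_i = r_i * exp(i*theta_i): a mirror-symmetric
# set of angles and a randomly perturbed (asymmetric) copy of the same set.
rng = np.random.default_rng(0)
theta_s = np.array([0.3, -0.3, 0.8, -0.8, 1.2, -1.2])            # symmetric about zero
theta_a = theta_s + rng.normal(scale=0.3, size=theta_s.size)     # perturbed, asymmetric
r = np.ones_like(theta_s)                                        # unit radii in both cases

def log_likelihood(r, theta):
    """ln(£) = sum_i ln(w_i) with w_i = z_i / sum_i z_i, using the complex logarithm."""
    z = r * np.exp(1j * theta)
    w = z / z.sum()
    return np.sum(np.log(w))          # np.log on complex w gives ln|w| + i*arg(w)

lnL_s = log_likelihood(r, theta_s)    # l_s = ln(£_s), symmetric configuration
lnL_a = log_likelihood(r, theta_a)    # l_a = ln(£_a), asymmetric configuration

delta = lnL_s - lnL_a                 # delta = ln(L_s / L_a); zero when the two are equally likely
eta = lnL_s / lnL_a                   # eta = log_{£_a}(£_s) = ln(£_s) / ln(£_a)
print("delta =", delta)
print("eta   =", eta)
```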

New Computational Physics Theory: computational likelihood physics

For theories that are equally, proportionally, discretely, or continuously likely, a likelihood ratio can be created and optimized in order to reveal their computationally unified dynamics. Comparing δ=ln(L_s/L_a ) and then systematically creating £_as=∏w_ij from the combined theories produces the unified theory for that system. This means there are unified theories for the universe and for the multi-universe (m-theory) that can be combined to create a joint mathematical foundation for physics.

More on the current theory: the dynamics of a physical system can be visualized through the use of projective statistics as it pertains to complex likelihood statistics, and therefore mechanics consists of projections of a unified computational likelihood.

I hope that this theory will make natural sense to everyone.

Computational likelihood physics: one might be interested in knowing that a complex logarithmic function can be created from a logarithmic likelihood that normalizes a vector field of forces. Review complex logarithms and likelihood statistics to understand the formalism of computational likelihood physics.

Computational likelihood physics: after reviewing the mathematical foundations of complex logarithms, one might be interested in knowing that by constructing a normalized complex log-likelihood, a unified force-field theory can be created.

Unified Forces using Computational Likelihood

Creating a unified likelihood of a particle in a force vector field is natural in terms of these methods, because a likelihood function can be created to include forces of all origins, inertial and non-inertial. £_i=∏w_i g_j f_k ... etc. is the combined likelihood function of, say, discrete and continuous vector force fields. The log-likelihood is L=ln£_i, and the likelihood is optimized (at a maximum) where ∂ln£/∂q_i=0. A projection of £ and ∂ln£/∂q_i can be made such that the dynamics of the system in terms of physical laws can be read off the computational likelihood. The field is normalized and unified because the combined likelihood function was created in the parameter space of the problem, such that the projection p(q_i,q_j,q_k) exists and p is optimized by ∂ln£/∂q_i=0, no matter how obscure p is.
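A sketch under stated assumptions: the Gaussian and Poisson factors below are hypothetical stand-ins for the continuous and discrete force-field likelihoods w_i, g_j, f_k, and scipy's optimizer is used to enforce the maximum condition ∂ln£/∂q_i=0 numerically on the combined log-likelihood.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm, poisson

# Hypothetical observations from two "fields": one continuous (Gaussian) and one discrete (Poisson).
rng = np.random.default_rng(1)
x_cont = rng.normal(loc=2.0, scale=1.0, size=200)   # continuous force-field observations
n_disc = rng.poisson(lam=4.0, size=200)             # discrete force-field observations

def neg_log_likelihood(q):
    """-ln(£) for the combined likelihood £ = prod_i w_i(q) * g_i(q) over both fields."""
    mu, lam = q
    lnL = norm.logpdf(x_cont, loc=mu, scale=1.0).sum() + poisson.logpmf(n_disc, mu=lam).sum()
    return -lnL

# Maximizing ln(£) numerically enforces the optimum condition d ln(£) / d q_i = 0 quoted above.
result = minimize(neg_log_likelihood, x0=[0.0, 1.0],
                  method="L-BFGS-B", bounds=[(None, None), (1e-9, None)])
print("maximum-likelihood parameters q =", result.x)
```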

Unified universe and multiverse:

Projections of the correct computational likelihood that corresponds to physical laws will show whether our universe comprised a balance between universe and multiverse during the big bang. This means that everything from the quantum soup through to the entropy and background radiation present today could be in a state of equal likelihood with an m-theory multiverse. It would be expected that the inertial reference frame was imparted to the multiverse at the time of the big bang, and thus the non-inertial/inertial logarithm ratio, or path function, can be created as log_(L_i) L_n. Projections of this should be consistent with the big bang and physical laws. Furthermore, the log-likelihood ratio δ=ln(L_n/L_i )=0, because if L_n and L_i are equally likely the ratio is 1 and ln 1 = 0. Thus δ=ln L_n − ln L_i=0, and the first derivative is also zero because both are optimized over the parameter space of the unified m-field universe.

A Quantum log likelihood:

For normalized wave functions φ_i=∏w_i there is a log-likelihood that is unified with a gravitational field G_j = ∏w_j; thus there is log_(φ_i) G_j for the path function, assuming a joint normalized function space and vector field over the same parameter space. δ=ln(G_j/φ_i )=0, and thus dδ=ln G_j − ln φ_i=0 and d²δ=0, which means there is a function f(δ) that is of interest to the computational likelihood of the quantum-gravitational universe. These four functions are of interest to any grand unified theory because they define a function f(δ) that supersedes the number of variables and unifies a normalized mathematical foundation for physics. The entirety of this function unifies mathematical physics in terms of all components of the forces in the universe. It makes no distinction between inertial and non-inertial reference frames.

Computational likelihood physics and a fractal universe: interpreting likelihoods as a balance between the small and large components of physics means that, to date, the computational likelihood can be interpreted as a quantum m-theory, where, in the pdf of our universe and the combined likelihood of the universe and multiverse, the expectation values of some of the physics are known as projections of the combined theory. So how do we combine the theories, and what do we do with expectation values and concrete theories that are known? Renormalization is important, and the continuous and discrete knowledge of our universe plays its part. Likelihood physics does not need to be continuous or discrete, because individual likelihoods can be included in the combined pdf of our universe. The combined pdf of everything on the scale of quantum mechanics is Ψ_q, on the scale of m-theory it is Ψ_m, and the likelihood or combined pdf is L=Ψ_q Ψ_m, given that both theories are equally likely.

Given an m-theory, it could be likely that the spatial component and also the time component are non-linear in a particular direction of evolution of the big bang. This is noticeable in an asymmetric background radiation, which is observable in space. The upper-bound theory, which is m-theory, and the lower-bound theory, quantum mechanics, force us to utilize certain information about the physics in between. It is expected that the difference in scale between quantoscopic and macroscopic physics makes the use of logarithms very convenient, such that lnL = lnA_q + lnA_m + iθ_q + iθ_m. Because r is not expected to be the same on all scales, we have to identify a symmetric space, which is purely mathematical, where r_m and r_q are in the same spatial units, meaning they are not fractal in measure.

This is really the difference between theory and mathematical foundation for computational likelihood physics. The angle of inclination is expected to predefine an asymmetry in the theory such that the quantum m-theory has a unified angle: lnL = lnA_q + lnA_m + iθ_qm. Since we are choosing to take the long route, we can assume that r and t are dependent on scale and size, meaning we can find scales and sizes of A and θ which are unified, such that for size and scale lnL = lnA_qm + iθ_qm in r and t; in other words, in the complex likelihood parameter space r and t are not fractal. Write lnL_f for the log-likelihood of a non-fractal space-time.
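The following is a small illustrative sketch (with made-up amplitudes and angles for the quantum and m-theory factors) of the complex-log splitting used above: the real part of lnL collects the magnitudes lnA_q + lnA_m, while the imaginary part collects the unified angle θ_qm = θ_q + θ_m.

```python
import numpy as np

# Hypothetical amplitude/phase factors on the two scales.
A_q, theta_q = 2.0, 0.4      # quantum-scale magnitude and angle
A_m, theta_m = 5.0, 1.1      # m-theory-scale magnitude and angle

z_q = A_q * np.exp(1j * theta_q)
z_m = A_m * np.exp(1j * theta_m)

lnL = np.log(z_q * z_m)      # ln L = ln A_q + ln A_m + i*(theta_q + theta_m)

print("Re(ln L) =", lnL.real, " vs ln A_q + ln A_m =", np.log(A_q) + np.log(A_m))
print("Im(ln L) =", lnL.imag, " vs theta_qm =", theta_q + theta_m)
```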

We have a mechanism for determining physics in regions of our universe bounded by quantum mechanics and m-theory; that is to say, we can include a check of our theory by comparing the log-likelihood of a symmetric scale, size, and space-time in the universe with a non-symmetric space-time likelihood. This means η = log_(L_qmn) L_qms. Then r can be fractal in x, y, z, and t can be fractal, while θ will only change the angle of inclination with respect to a symmetric mathematical foundation. This is a method for determining the asymmetric physics with respect to a symmetric mathematical foundation or metric. There is a particular L_qms metric for all of physics.

This means that the basis for normalization must occur at a particular size and scale. η = 1 in this case, because that means the fractal universe and the symmetric universe are independent of size and scale; for unified dynamics this is not expected. This is the case for the quantum m-theory expected from computational likelihood physics. Why? Because we can find situations in physics where we lose linearity of r and t at high speeds. The unusual thing we can do is include the two cases in the likelihood equation and see that, in the complex likelihood, the angle of inclination will absorb the difference and reveal the fractal nature in r and t.

However, I would choose to do something slightly unusual here: I would confine scale and size to be the determining factor of θ in the complex computational likelihood of physics. This is because size and scale are unique to all of physics; there is only one size and one scale for every law of physics. That means if we always start from the origin in r and t in our complex likelihoods, we get a different vector length for each inclination. This gives us a mechanism to reduce the fractal likelihood when necessary: we take the limit of θ and call it a theory. If we don't, I would suggest θ as a function of t, because it would control the inception of our universe from small to large. This means we can talk of physics in terms of abstract concepts such as smaller or larger.

The speed at which the likelihood of our universe goes to one and the log-likelihood of our universe goes to zero: the combined likelihood of our universe is a pdf that, as we include dependent variables such as the components of fractals, goes to one (unity). We know the expectation value of dv/dt of L has a maximum, which is the speed of light c: lim_(dv/dt→c) d²lnL/dt² = 0. This does not necessarily mean that the universe is optimized such that c drives the universe to a point of maximum likelihood; the maximum log-likelihood does, however, have a limit on the speed at which it can approach its optimum configuration. We would expect lim_(dv/dt→c) dlnL/dq = 0 as we add dependent variables. It means that as we add probability to the likelihood function there will be a spectral configuration with group velocity c and a maximum value v for the maximum log-likelihood, where v approaches c as we attribute more probability to the likelihood of the m-theory.

As we add m-theory to the probability density function, which is the likelihood of our universe, we unify our physics by making the variables independent, because the first and second derivatives are zero. So what is the problem here, given that we know space-time is delimited by c? Space-time has stopped: it is a unified symmetry that must have been present at t_0, meaning the velocity of light was temporarily zero, or frozen in time. The only thing unifying space and time is the likelihood of the quantum m-theory that bounds our universe. It means that there is a unique path function η = log_(L_t) L_x for each independent variable at the start of the universe and an extraction of symmetry as the evolution unfolds during the big bang, because of the size and scale of space-time that defines the distribution of our universe in terms of quantum and m-theory, small and large respectively. The true beginning of the universe is the initial measurement, or the measurement being done. x, y, z, t, and θ of size and scale (how fractal), together with the weight, are all independent. The combined likelihood, or pdf, is the product over each individual probability, such that the symmetric likelihood is still the mathematical foundation for all of physics, and the only variation was the statistical fluctuation that caused an asymmetric quantum m-theory bounding our universe.

Thermal physics: of course, you may have deduced that computational likelihood physics is an unusual way to get to thermodynamics, but there is a lot of truth to what you've thought so far. For the combined complex log-likelihood, the computational theory is transformable to entropy: T_q = T_m, where 1/T = k(dσ/dq) and σ = ln L. This is where the two approaches meet. The difference is that we have approached thermodynamics from the perspective of the complex likelihood. This means that at the start of the big bang the universe was in thermal equilibrium with a quantum multiverse, or in other words a cold cup of quantum soup.
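A finite-difference sketch of the relation 1/T = k(dσ/dq) with σ = ln L; the likelihood below is a hypothetical Boltzmann-style stand-in (nothing about its form comes from the text), built so that the temperature recovered from the derivative can be checked against the value put in.

```python
import numpy as np

k_B = 1.380649e-23          # Boltzmann constant (J/K)
T_true = 300.0              # hypothetical equilibrium temperature (K)

def sigma(q):
    """sigma = ln L for a toy Boltzmann-weighted likelihood in the parameter q (here an energy)."""
    return q / (k_B * T_true)        # ln of exp(q / kT); stand-in for the combined log-likelihood

# 1/T = k * (d sigma / d q), estimated with a central finite difference.
q0, dq = 1.0e-21, 1.0e-24
dsigma_dq = (sigma(q0 + dq) - sigma(q0 - dq)) / (2 * dq)
T_recovered = 1.0 / (k_B * dsigma_dq)
print("recovered T =", T_recovered, "K")   # matches T_true
```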

We have two Schrödinger equations, one for quantum mechanics and the other for the quantum m-theory, and they are in thermal equilibrium. We can add some constraints to the m-theory, but we don't have to, because T is expected to be in fluctuation; thus v, the velocity of the m-theory, is non-zero and a, the acceleration, is non-zero. We have an inertial m-theory if we want to deal only with the v = 0, a = 0 state of the quantum m-theory: ΨH_m=ΨE_m and ΨH_q=ΨE_q, where E_m = Gm/r for a vantage point far from the m-theory universe; with negligible charge and mass at the point of observation, we can illustrate the dynamics with zero kinetic energy. At this point ∑F_qm = 0; this is the unified force on a mass m and charge q at the point (r, t) for a non-inertial reference frame. L = ∏_i Ψ_qi Ψ_m, σ = ln L, and 1/T = k(dσ/dq).

This suggests that the momentum could have been non-zero, in which case the Schrödinger equation would not reduce to just the potential: ∑F_qm = ma, where m is the mass of the universe, because a statistical fluctuation of the temperature T is expected. Thus the unified force is ma in the m-theory but is equal to the sum over all of the forces at that vantage point. ΨH_m=ΨE_m does not reduce to E_m = Gm/r with zero momentum; there is therefore a non-zero kinetic energy. This means that the likelihood of the combined quantum and m-theory is L = ∏_in Ψ_qi Ψ_mn, where σ = ln L and 1/T = k(dσ/dq). ∑F(x,t,γ)_qm = m d(γv)/dt, where v is the velocity of the unified force at the vantage point to the quantum m-theory and γ is the gamma factor of special relativity.

Now here is the main point about computational likelihood physics: it lets us think of the quantum m-theory system as a free particle with internal degrees of freedom from our vantage point. In doing so, we have to contribute to the likelihood L the probability that the quantum m-theory is free to move. This is because ∑F(x,t,γ)_qm = m d(γv)/dt, so L = ∏_i Ψ_qi Ψ_m Ψ_f, adding Ψ_f for the free-particle astrophysics of the quantum m-theory universe. Before including this in the likelihood, we could have stated the likelihood ratio δ = ln(L_qm/L_f) = 0, where dδ/dq = 0 and δ = ln L_qm − ln L_f = 0, such that L is likely.

Scale in computational likelihood physics

The unbinned and binned likelihood

This is a simple argument about the scale of the number of observations for a universal theory: an unbinned or a binned approach defines the level of computation that is necessary to define a physical theory about the nature of a quantum-mechanical m-theory. Binning makes the observational aspect of the computation scalable and therefore must be included in the likelihood functions for the quantum m-theory. The unbinned and binned likelihood aspect of computational physics theory gives rise to technological advances in computer science, because a universal quantum m-theory can be used to develop an unbinned and binned relational quantum m-theory likelihood database, or UBLD (unbinned binned likelihood database); it is a relational database based on quantum m-theory. Binning defines an asymmetry because each bin has internal degrees of freedom. This is what makes likelihood physics a computational theory: a likelihood equation can be written to include all of physics, but the binning would exclude crucial internal degrees of freedom. Hence the power of quantum statistical physics, which computational likelihood physics mirrors through the use of computers.
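A sketch, using a hypothetical exponential-decay dataset, of the difference the paragraph describes: the unbinned likelihood keeps one pdf factor per observation, while the binned likelihood keeps one Poisson factor per bin, which is scalable but integrates away the within-bin ("internal") degrees of freedom.

```python
import numpy as np
from scipy.stats import expon, poisson

rng = np.random.default_rng(2)
tau_true = 2.0
data = rng.exponential(scale=tau_true, size=10_000)     # hypothetical decay-time observations

def unbinned_lnL(tau):
    """One log-pdf term per observation: ln £ = sum_i ln f(x_i; tau)."""
    return expon.logpdf(data, scale=tau).sum()

edges = np.linspace(0.0, 20.0, 41)                       # 40 bins; rare events beyond 20 are ignored
counts, _ = np.histogram(data, bins=edges)

def binned_lnL(tau):
    """One Poisson term per bin: ln £ = sum_b ln P(n_b | mu_b(tau))."""
    mu = data.size * np.diff(expon.cdf(edges, scale=tau))  # expected counts per bin
    return poisson.logpmf(counts, mu=mu).sum()

print("unbinned ln L at tau_true:", unbinned_lnL(tau_true))
print("binned   ln L at tau_true:", binned_lnL(tau_true))
```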

Transforms from the complex plane to the kT plane are governed by the equation i = 1/kT in computational likelihood physics. We could now rewrite everything, or simply take note that this is the transform that produces the operator taking us from a quantum-mechanical equation to a wave equation.

The Lagrangian and the grand canonical ensemble in computational likelihood field theory:

If we started from the energy of each field as stated by the Lagrangian, in terms of the Boltzmann factor, we would have L, the Lagrangian, in each of the exponents above, divided by the factor kT: z_i=r_i e^(L_i/kT) for each of the field states, and w_i=z_i/(∑_i z_i ) for the normalized probability. The combined likelihood is ∏_i w_i. There is a maximization condition that occurs simultaneously for the likelihood and its natural logarithm: the derivative of the likelihood is zero, as is the derivative of the ln of the likelihood. Thus d^n ln(∏_i w_i) = 0, from which we get (1/kT)ds/dn − d(N ln(∑_i z_i ))/dn = 0, where ds/dn is the derivative of the action in Lagrangian dynamics and ln(∑_i z_i ) is the grand canonical ensemble of field states of the system. This is the computational unified likelihood equation, because L, the Lagrangian, can be included for each of the fields. ds/dn is typically zero in Lagrangian dynamics, but in this theory it is equal to the derivative of the grand canonical ensemble times kT, where T would be the temperature of the Universe if we carried out this formulation for all of the systems included. Fixing parameters is really the name of the game in statistical mechanics, which this really is.
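A minimal sketch of the construction just described, with hypothetical Lagrangian values L_i for a handful of field states: the Boltzmann-style factors z_i = r_i e^(L_i/kT) are normalized into probabilities w_i, and the combined log-likelihood reproduces the structure ∑_i L_i/kT − N ln(∑_i z_i) used in the text.

```python
import numpy as np

kT = 1.0                                            # hypothetical temperature scale (natural units)
L_fields = np.array([-1.2, -0.5, 0.3, 0.8])         # hypothetical Lagrangian values L_i of the field states
r = np.ones_like(L_fields)                          # prefactors r_i, set to 1 here

z = r * np.exp(L_fields / kT)                       # z_i = r_i * exp(L_i / kT)
Z = z.sum()                                         # grand-canonical-style sum over field states
w = z / Z                                           # normalized probabilities w_i

N = len(z)
lnL_combined = np.log(w).sum()                      # ln(prod_i w_i)
check = (L_fields / kT).sum() + np.log(r).sum() - N * np.log(Z)
print("ln likelihood =", lnL_combined, " check =", check)   # the two agree
```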

Except that we can compare a similar formulation with the Hamiltonian of the system and ensure that we are describing a likely situation using the likelihood ratio. This is the deviation from normal statistical mechanics, because we now have a way to include both the quantum formulation using the Hamiltonian and the Lagrangian formalization in a likelihood ratio, which must have a first derivative equal to zero if both formulations are equally likely. So to include quantum mechanics in our formulation we have to state that, for the parameter space of the problem, there is a focusing parameterization.

To begin, H = T + V, and we include it in our Boltzmann factor of an equally likely formulation, so that we have a e^(H/kT) = z_j for each of the particle states. We create our likelihood the way we did before; after normalizing, the likelihood is ∏_j w_j, and thus we have a likelihood ratio, this time δ=ln(L_l/L_h ), with h for the likelihood derived from the Hamiltonian and l for the likelihood derived from the Lagrangian. δ_1 = ln L_l − ln L_h, and δ = ln L_h − ln L_l; they have to be the same under inversion if the two formulations that represent the system are equally likely. dδ/dn = (i/kT)ds_i/dn − d(N ln(∑_i z_i ))/dn − (i/kT)d∑H_j/dn + d(N ln(∑_j z_j ))/dn = 0, because the maximum likelihood ratio of equations that are equally likely is zero. This is the unified equation of likelihood quantum field theory.

This can be extrapolated to cases in physics where the focus of the parameters is only partially likely with respect to the other, so the ratio is not 1 and the derivative is not zero but a constant, a function, etc.: Φ = dδ/dn = (i/kT)ds_i/dn − d(N ln(∑_i z_i ))/dn − (i/kT)d∑H_j/dn + d(N ln(∑_j z_j ))/dn, where Φ is the ml-quantum of likelihood quantum field theory. In this case of unequal likelihood it is expected that our second route be taken: η = log_(£_h) £_l = ln(£_l)/ln(£_h). The ratio of the log-likelihoods is a constant in this case, with first derivative equal to zero, and we now have a constraint that the likelihood of H, denoted £_h, scales the likelihood £_l, which is the likelihood of the Lagrangian formulation. This may provide a method to scale dynamics by results from quantum mechanics.
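An illustrative sketch (with made-up kinetic and potential values for a few states) of the comparison described above: one likelihood is built from Boltzmann weights of the Lagrangian L = T − V, the other from the Hamiltonian H = T + V, and the quantities δ = ln(L_l/L_h) and η = ln(£_l)/ln(£_h) measure how far the two formulations are from being equally likely.

```python
import numpy as np

kT = 1.0

def boltzmann_lnL(values):
    """ln(prod_j w_j) for weights w_j = exp(v_j/kT) / sum_j exp(v_j/kT)."""
    z = np.exp(values / kT)
    w = z / z.sum()
    return np.log(w).sum()

# Hypothetical per-state kinetic and potential energies.
T_kin = np.array([0.5, 1.0, 1.5, 2.0])
V_pot = np.array([0.2, 0.4, 0.6, 0.8])

lnL_l = boltzmann_lnL(T_kin - V_pot)   # likelihood built from the Lagrangian L = T - V
lnL_h = boltzmann_lnL(T_kin + V_pot)   # likelihood built from the Hamiltonian H = T + V

delta = lnL_l - lnL_h                  # delta = ln(L_l / L_h); zero only if the two are equally likely
eta = lnL_l / lnL_h                    # eta = log_{£_h}(£_l), the scaling between the formulations
print("delta =", delta, " eta =", eta)
```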

It was mentioned that the parameter space requires focusing for the likelihood ratio method to be valid. In cases where that is not true, we still have a limitation on the focusing parameters when they pertain to relativity: r = γr' is a limitation on how the spatial parameter deviates in likelihood, and t = (1/γ)t' for the temporal part. So in reality we have a boundary on the likelihood ratio, which is relativity. All things being likely, we have a starting point for physics theories even when we want to include a ratio that is not composed of equally likely definitions of dynamics and quantum mechanics. This should provide a way of combining the two methods, with the stipulation that the parameter space is limited by relativity.
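A small numerical sketch of the relativistic bound just mentioned, with a hypothetical relative speed: the Lorentz factor γ sets how far the spatial and temporal parameters r = γr' and t = (1/γ)t' can deviate.

```python
import numpy as np

c = 299_792_458.0               # speed of light (m/s)
v = 0.6 * c                     # hypothetical relative speed between the two parameterizations

gamma = 1.0 / np.sqrt(1.0 - (v / c) ** 2)

r_prime, t_prime = 1.0, 1.0     # hypothetical primed-frame parameter values
r = gamma * r_prime             # r = gamma * r'   (bound on the spatial parameter)
t = t_prime / gamma             # t = (1/gamma) t' (bound on the temporal parameter)
print("gamma =", gamma, " r =", r, " t =", t)
```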

Contributions

Wiki EWTTP - Eric Weinstien's Treasure Trove of Physics. The mathematical foundations are largely derived from the sources above and from their connections with the Fermi Gamma-ray Space Telescope and the Santa Cruz Institute for Particle Physics. Dennis W Melton, Luke Homan. Thermal Physics, second edition, Charles Kittel / Herbert Kroemer.