Showing posts with label Science. Show all posts

Wednesday, June 21, 2017

Black Hole Computer Simulations Help to Identify Third Gravitational Wave

Gravitational Wave Signal

Rochester Institute of Technology researchers have continued their significant contributions to gravitational wave astronomy with the third detection of gravitational waves, along with a new black hole 49 times the mass of the sun. The results from the January 4 detection were recently announced by the LIGO collaboration, and the finding will be published in the journal Physical Review Letters.

RIT scientists helped the collaboration measure and interpret the black holes' spins and their alignment. These measurements can tell scientists what happens when massive stars die and collapse into black holes. The gravitational wave signal was produced by the collision of two black holes, and the newly formed black hole created by the merger had a mass of around 49 times that of the sun.

This fills a gap between the masses of the two merged black holes detected earlier by LIGO, which weighed in at 62 solar masses (first detection) and 21 solar masses (second detection). Richard O’Shaughnessy, associate professor in RIT’s School of Mathematical Sciences, commented that the outlines of a population of black holes are starting to emerge.
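
As a rough check on the mass bookkeeping, the energy radiated in such a merger follows from E = Δmc². A minimal sketch, assuming component masses of roughly 31 and 19 solar masses for this event (figures taken from LIGO's announcement, not stated in the article above):

```python
# Back-of-the-envelope energy radiated as gravitational waves (E = Δm c²).
# Component masses (~31 and ~19 solar masses) are assumed from LIGO's
# announcement of the January 4 event; the article gives the remnant as ~49.
M_SUN = 1.989e30   # solar mass, kg
C = 2.998e8        # speed of light, m/s

m1, m2 = 31.2, 19.4          # component masses, solar masses
m_final = 48.7               # remnant mass, solar masses

delta_m = m1 + m2 - m_final  # mass radiated away, solar masses
energy_joules = delta_m * M_SUN * C**2

print(f"radiated mass: {delta_m:.1f} solar masses")
print(f"radiated energy: {energy_joules:.2e} J")
```

Under these assumed inputs, roughly two solar masses were converted into gravitational-wave energy in a fraction of a second.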

Combination of Heavy Black Holes – Plenty of Net-Aligned Spin

The LIGO paper cites O’Shaughnessy’s forthcoming research, which was used in part to help interpret the latest event. He presented this work at the American Astronomical Society meeting in Austin, Texas. He noted that the new observations rule out the possibility that merging heavy black holes have large net-aligned spins, and are consistent with LIGO’s breakthrough observation of 2015.

O’Shaughnessy stated that either the spins of heavy black holes are small, or they are tilted so that their net effect cancels out. Various teams have made diverse predictions for black hole spins, and the most extreme forecasts are now ruled out; for the rest, it is only a matter of time. The Center for Computational Relativity and Gravitation, a research center at RIT, is developing the techniques for understanding spin, a crucial astrophysical parameter, according to Professor Carlos Lousto of RIT’s School of Mathematical Sciences.

Lousto commented that at RIT they are working on sharpening the spin measurements of the black holes by solving Einstein's equations to high precision and directly comparing those theoretical predictions with the signals observed by LIGO.

Gravitational Waves Move at the Speed of Light

At RIT’s Center for Computational Relativity and Gravitation, researchers have directly linked the wave signals to their computer simulations of Einstein’s equations. The latest observations further test Einstein’s general theory of relativity and its prediction that gravitational waves always travel at the speed of light.

LIGO found no evidence that the waves travelled at different speeds. According to Manuela Campanelli, director of the Center for Computational Relativity and Gravitation and of Frontiers in Gravitational Astrophysics, an RIT signature research area, this third event falls in a mass range intermediate between the earlier two, showing that black hole mergers are common in the universe.

The RIT members of the LIGO Scientific Collaboration include John Bero, Hans-Peter Bischof, Manuela Campanelli, James Healy, Brennan Ireland, Jacob Lange, Carlos Lousto, Rupal Mittal, Richard O’Shaughnessy, Monica Rizzo, Nicole Rosato, John Whelan, Andrew Williamson, Jared Wofford, Daniel Wysocki, Yuanhao Zhang and Yosef Zlochower.

The international collaboration has around 1,000 members who carry out LIGO research together with the European-based Virgo Collaboration. The new discovery was made during LIGO’s current observing run, which began November 30, 2016, and will continue through the summer.

LIGO Observations – Twin Detectors

LIGO’s observations are carried out by twin detectors, one in Hanford, Washington, and the other in Livingston, Louisiana. The observatory is operated by Caltech and the Massachusetts Institute of Technology with funding from the National Science Foundation. LIGO made the first direct observation of gravitational waves in September 2015 and the second in December 2015.

The LIGO breakthrough paper cites landmark 2005 research on binary black hole mergers by Campanelli and her team. Building on that milestone work, Lousto and Healy numerically simulated the merger of a pair of black holes, producing gravitational waveforms that matched LIGO’s first detection.

LIGO is funded by the National Science Foundation and was conceived, built and is operated by Caltech and MIT. Financial support for the Advanced LIGO project was led by the NSF, with Germany (Max Planck Society), the U.K. (Science and Technology Facilities Council) and Australia (Australian Research Council) making significant commitments and contributions to the project.

More than 1,000 scientists from around the world participate in the effort through the LIGO Scientific Collaboration, which includes the GEO Collaboration. LIGO partners with the Virgo Collaboration, a consortium of 280 additional scientists across Europe supported by the Centre National de la Recherche Scientifique, the Istituto Nazionale di Fisica Nucleare, Nikhef, and Virgo’s host institution, the European Gravitational Observatory.

Wednesday, May 31, 2017

Mining the Moon Could Give Us Enough Rocket Fuel, Says Student Researcher

Mining of the Moon

2017 Caltech Space Challenge

It’s been forty-five years since humans last set foot on the moon, and the moon is now the focus of efforts not only to explore space but also to develop an enduring, independent space-faring society. Planning an expedition to Earth’s nearest celestial neighbour is no longer just a NASA effort, though the U.S. space agency has plans for a moon-orbiting space station that would serve as a staging ground for Mars missions in the early 2030s.

The United Launch Alliance, a joint venture between Lockheed Martin and Boeing, has planned a lunar fuelling station for spacecraft, capable of supporting 1,000 people living in space within 30 years. Billionaires Elon Musk, Jeff Bezos and Robert Bigelow all have companies focussed on delivering people or goods to the moon.

Several teams competing for a share of Google’s US$30 million Lunar X Prize intend to launch landers to the moon. Recently, the student researcher joined 27 other students from across the world in the 2017 Caltech Space Challenge, proposing designs for what a lunar launch and supply station for deep space missions could look like and how it would work.

Moon – One-Sixth of Earth’s Gravity

Presently, all space missions launch from Earth, despite its strong gravitational pull. To escape Earth’s gravity, a rocket must travel at 11 kilometres per second – 25,000 miles per hour.
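
The 11 km/s figure is Earth's escape velocity, v = √(2GM/R). A minimal sketch comparing Earth and the moon, using standard textbook masses and radii (assumed values, not figures from the article):

```python
import math

# Escape velocity v = sqrt(2 * G * M / R), using standard textbook values.
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

bodies = {
    # name: (mass in kg, mean radius in m)
    "Earth": (5.972e24, 6.371e6),
    "Moon":  (7.342e22, 1.737e6),
}

v_esc = {name: math.sqrt(2 * G * m / r) for name, (m, r) in bodies.items()}

for name, v in v_esc.items():
    print(f"{name}: escape velocity ≈ {v / 1000:.1f} km/s")
```

The moon's escape velocity comes out under a quarter of Earth's, which is what makes the lunar surface so appealing as a launch and refuelling site.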

A rocket leaving Earth must carry all the fuel needed to reach its destination and, if it is to come back, for the return trip as well. Fuel is heavy, and accelerating it to such high speeds takes a great deal of energy. If spacecraft could refuel in orbit, that launch energy could instead lift more travellers, cargo or scientific equipment into orbit.
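
The fuel penalty described above is quantified by the Tsiolkovsky rocket equation, Δv = vₑ ln(m₀/m_f). A sketch assuming a hydrogen-oxygen exhaust velocity of about 4.4 km/s (a typical textbook figure, not a number from the article):

```python
import math

# Tsiolkovsky rocket equation: delta_v = v_e * ln(m0 / mf),
# so the required mass ratio is m0/mf = exp(delta_v / v_e).
v_exhaust = 4400.0   # m/s, assumed hydrogen-oxygen exhaust velocity
delta_v = 11200.0    # m/s, roughly Earth escape velocity

mass_ratio = math.exp(delta_v / v_exhaust)
propellant_fraction = 1 - 1 / mass_ratio

print(f"mass ratio m0/mf ≈ {mass_ratio:.1f}")
print(f"propellant fraction ≈ {propellant_fraction:.0%}")
```

Even before accounting for gravity and drag losses (which make the real figure worse), over 90% of the launch mass must be propellant – the reason orbital refuelling is so attractive.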

The spacecraft could then refuel in space, where Earth’s gravitational pull is weaker. The moon has one-sixth of Earth’s gravity, making it an attractive alternative base. It also has ice, which we already know how to process into the hydrogen-oxygen propellant used by many modern rockets.

NASA’s Lunar Reconnaissance Orbiter and Lunar Crater Observation and Sensing Satellite missions have found considerable quantities of ice in permanently shadowed craters on the moon.


These would be tricky locations to mine, since they are extremely cold and receive no sunlight to power roving vehicles. But large mirrors installed on the crater rims could illuminate solar panels in the permanently shadowed regions below.

Rovers from Google’s Lunar X Prize competition and NASA’s Lunar Resource Prospector, set to launch in 2020, would also help locate good areas to mine ice. Depending on where the best ice reserves are, it may be necessary to build several small robotic moon bases; each would mine ice, manufacture liquid propellant and transfer it to passing spacecraft.

The team developed a plan to accomplish these tasks with three different types of rovers, along with some small robotic shuttles to rendezvous with nearby deep-space mission vehicles in lunar orbit. One rover, the Prospector, would explore the moon and locate ice-bearing areas.

A second, the Constructor, would follow behind, building a launch pad and packing down roadways to ease movement for the third kind of rover, the Miners. The Miners would collect the ice and deliver it to storage tanks and to an electrolysis processing plant that splits water into hydrogen and oxygen.

Landing Pad – Lunar Resupply Shuttles

Moreover, the Constructor would also erect a landing pad where small spacecraft called Lunar Resupply Shuttles would arrive to collect fuel for delivery to newly launched spacecraft passing by the moon.

The shuttles would run on moon-made fuel and use advanced guidance and navigation systems to travel between the lunar bases and their target spacecraft. Once enough fuel is being produced and the shuttle delivery system has proven reliable, the plan calls for building a gas station in space.

The shuttles would deliver ice directly to the orbiting fuel depot, where it would be processed into fuel and where rockets heading to Mars or elsewhere could dock to refill. The depot would carry large solar arrays powering an electrolysis module to melt the ice and turn the water into fuel, together with large tanks to store what is produced.

NASA is already working on most of the technology needed for such a depot, including docking and fuel transfer. A working depot could be ready in the early 2030s, in time for the first human missions to Mars.

Earth-Moon Lagrangian Point – L1

To be most useful and efficient, the depot should be placed in a stable orbit between the Earth and the moon. The Earth-moon Lagrangian Point 1 (L1) is a point in space about 85% of the way from Earth to the moon, where Earth’s gravity is balanced by the moon’s gravity pulling in the other direction.
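
The 85% figure can be recovered numerically: in the frame rotating with the moon, L1 is where Earth's pull, the moon's pull and the centrifugal term balance. A bisection sketch using the standard moon-to-Earth mass ratio (an assumed input, not a figure from the article):

```python
# Locate the Earth-moon L1 point by bisection in the rotating frame.
# Units: Earth-moon distance = 1 and G * (M_earth + M_moon) = 1.
# With mu = M_moon / (M_earth + M_moon), L1 satisfies
#   (1 - mu)/x**2 - mu/(1 - x)**2 - (x - mu) = 0,
# where x is the distance from Earth and (x - mu) is the centrifugal
# term measured from the Earth-moon barycenter.
mu = 0.0123 / 1.0123   # moon-to-total mass ratio, ~0.01215

def f(x):
    return (1 - mu) / x**2 - mu / (1 - x)**2 - (x - mu)

lo, hi = 0.5, 0.99     # f(lo) > 0, f(hi) < 0: root is bracketed
for _ in range(60):
    mid = (lo + hi) / 2
    if f(mid) > 0:
        lo = mid
    else:
        hi = mid

print(f"L1 sits about {lo:.1%} of the way from Earth to the moon")
```

The root lands near 0.85, matching the article's figure; ignoring the centrifugal term (pure gravity balance) would instead give about 90%.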

It is an ideal pit stop for a spacecraft on its way to Mars or the outer planets. The team also devised a fuel-efficient way of getting spacecraft from Earth orbit to the L1 depot, one that needs much less launch fuel and frees more lift capacity for cargo. First, a spacecraft would launch from Earth into Low Earth Orbit with an empty propellant tank. It would then be towed, with its cargo, from Low Earth Orbit to the L1 depot by a solar electric propulsion tug – a spacecraft driven largely by solar-powered electric thrusters. This approach would triple the payload delivered to Mars.

By helping us escape both Earth’s gravity and our dependence on its resources, a lunar gas station could be the first step in the giant leap toward making humankind a spacefaring civilization.

Saturday, January 28, 2017

Gravity Waves On Venus Spark Interest As Researchers Weigh In

Venus might be the closest look-alike to Earth in our Solar System. It is often dubbed Earth’s sister planet owing to its similar size, mass, composition and proximity to the Sun. But that is not why it has caught the eyes of scientists all over the world: huge wave-like features have been spotted in the Venusian atmosphere, and what is more mysterious is that these features are stationary even though the surrounding clouds move at about 100 meters per second.

What could be causing these unusual features?

Since their discovery, scientists and researchers have come forward with theories, and the most plausible relates to a phenomenon widely observed on Earth known as gravity waves. These waves, stretching over 6,200 miles, were likely created in the atmosphere as air flowed over the mountains below, sending ripples upward. This would be the first time such waves have been observed on another planet.

What are Gravity waves? 

A gravity wave is an atmospheric phenomenon commonly seen on Earth over mountainous terrain. Roughly speaking, gravity waves form when air ripples over rough, bumpy surfaces. The waves then propagate upward, growing larger and larger in amplitude until they break just below the cloud tops. The tall mountains on the Venusian surface are speculated to be a key factor in creating these giant gravity waves. Though the details of wave formation differ between Earth and Venus, the underlying principle is the same. This phenomenon, which involves atmospheric flow over mountains, should not be confused with gravitational waves, which are ripples in spacetime.

Images of these baffling features were captured by the Japanese Aerospace Exploration Agency’s Akatsuki spacecraft as it entered orbit around Venus in late 2015. Researchers from Rikkyo University in Tokyo studied the waves and published their findings in the journal Nature Geoscience this year. The stationary waves persisted in the Venusian atmosphere for quite some time, and were observed between December 7 and 11. The researchers added that the most interesting and unique feature of the phenomenon was that it remained at the exact same geographical spot despite the super-rotation of the Venusian atmosphere at speeds of more than 100 meters per second.

Wednesday, December 28, 2016

Watch the Earth Change before Your Eyes


Landscapes and Features of Earth Have Undergone Dramatic Change

Many of the Earth’s landscapes and features have undergone dramatic change since 1984. Google has made a major update to its Time-lapse tool, adding four years of imagery, vast amounts of new data and a sharper view of the Earth from 1984 to 2016.

The latest images in Google’s Time-lapse application portray how features such as Alaska’s Columbia Glacier and Dubai’s expanding cityscape have changed drastically over the last 32 years. Google first released its Time-lapse visualisation of Earth in 2013, offering the most comprehensive publicly available picture of our changing planet.

The interactive time-lapse experience lets people explore changes to the Earth’s surface like never before, from the sprouting of the Las Vegas Strip to the retreat of Alaska’s Columbia Glacier. It also lets users explore a variety of compelling locations from 1984 onward. In London, for instance, one can watch the construction of City Airport and the Olympic Stadium in Stratford. Zooming in on the Aral Sea shows how it has been drying up since the 1960s owing to Soviet irrigation programmes.

New Time-lapse Shows Sharper View of Planet

Moving the date to 2007 shows the Aral Sea reduced to about 10% of its original size. Meanwhile, over the past three decades, Alaska’s Columbia Glacier has retreated more than 12 miles. Another city that has undergone drastic change since 1984 is New York.

Comparing 1984 with 2016 shows how much development has taken place around Central Park and in Brooklyn. Chris Herwig, Program Manager for Google Earth Engine, commented that `leveraging the same techniques used to improve Google Maps and Google Earth back in June, the new Time-lapse shows a sharper view of our planet, with truer colours and fewer distracting artefacts'.

He further explained that, using Google Earth Engine, they combined over 5,000,000 satellite images (approximately 4 petabytes of data) to create 33 images of the entire planet, one for each year. For the latest update they had access to more historical images thanks to the Landsat Global Archive Consolidation Program, together with fresh imagery from two new satellites, Landsat 8 and Sentinel-2.

USGS/NASA - Landsat

The 33 new terapixel global images were then encoded into just over 25,000,000 overlapping multi-resolution video tiles, made interactively explorable through Carnegie Mellon CREATE Lab’s Time Machine library, a technology for creating and viewing zoomable, pannable time-lapses over space and time.

To explore the feature, type the name of a place into the search bar and move the timeline at the bottom to select the year you would like to view. The images were originally collected as part of an ongoing joint mission between the United States Geological Survey (USGS) and NASA known as Landsat.

The Landsat mission’s satellites have been observing the Earth from space since the 1970s. Their images are sent back to Earth and archived on USGS tape drives – a task far easier with today’s digital technology than with the analog tape of the 1970s. To make this historic archive of Earth imagery available online, Google began working with the USGS in 2009.

Wednesday, August 10, 2016

The Mystery of Space Roar


ARCADE – Scientific Instrument Package Sent into Space by Helium Balloon

Early in July 2006, ARCADE (the Absolute Radiometer for Cosmology, Astrophysics and Diffuse Emission), a scientific instrument package, was carried aloft by a helium balloon. NASA’s Columbia Scientific Balloon Facility in Palestine, Texas was the launch point, and the instrument reached an altitude of 120,123 feet, at the edge of what one would call `space'. Dr Alan Kogut, a research scientist at NASA’s Goddard Space Flight Center and head of the ARCADE team, was looking for unusual radio emissions, which are challenging to monitor at ground level because of the radio noise present on the ground.

Radio emission from space has been known since pioneering observations by Karl Jansky, and arguably earlier claims by Nikola Tesla. There is a faint, uniform radio emission believed to be a relic of the Big Bang: the Cosmic Background Radiation. Dr Kogut had hoped to confirm the Cosmic Background and perhaps find a few new radio sources; what he found instead was one of those historic `wow moments' of scientific research. In his own words: `The universe really threw us a curve. Instead of the faint signal we hoped to find, here was this booming noise six times louder than anyone had predicted.'

NASA Discovered `Space Roar’

NASA had found something now known as the `space roar', a signal six times louder than anything anyone had expected – a signal detected by NASA’s ARCADE instrument that presently has no explanation.

In space, no one can hear you scream, since there is no medium through which sound can travel. The space roar is not actually a sound; it is radio waves. It was first discovered by ARCADE – a fancy name for some very fancy equipment that NASA attached to a big balloon and sent to the edge of space. ARCADE was intended to look for radio signals from distant galaxies.

Since radio is so commonly used to carry audio signals, it is easy to forget that it is just another form of light. Radio waves are much less energetic than visible light, and our eyes are not equipped to see them, but they behave in the same way.
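
Planck's relation E = hf makes the energy difference concrete. A sketch comparing an FM-band radio photon with a green-light photon (the frequencies below are illustrative choices, not values from the article):

```python
# Photon energy E = h * f for a radio wave versus visible light.
H = 6.626e-34        # Planck constant, J·s
C = 2.998e8          # speed of light, m/s

radio_freq = 100e6               # 100 MHz, an FM-band radio wave
visible_freq = C / 550e-9        # green light, ~550 nm wavelength

e_radio = H * radio_freq
e_visible = H * visible_freq

print(f"radio photon:   {e_radio:.2e} J")
print(f"visible photon: {e_visible:.2e} J")
print(f"a visible photon carries ~{e_visible / e_radio:.0f}x more energy")
```

Both are electromagnetic waves obeying the same physics; only the per-photon energy (and hence what detectors and eyes respond to) differs.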

Intended to Pick Up Faint Radio Signals of Distant Stars

A star releasing radio waves is not much different from the sun releasing visible light; indeed, to someone far away or far in the future, the sun may appear to emit primarily radio waves. ARCADE, once aloft, was intended to pick up the faint radio signals of distant sources. Instead it received a strong blare of radio noise, an input described as a `boom' by those who have researched it.

After further research, the idea that the signal came from very loud early stars was ruled out. The researchers also ruled out dust in our own galaxy. It was simply a blast of radio – the `space roar' – part of the background noise with no explained source. Though the space roar has piqued the interest of many, there is as yet no explanation for it.

Monday, July 18, 2016

Electromagnetic Aircraft Launch System


The Electromagnetic Aircraft Launch System is a system under development by the United States Navy for launching carrier-based aircraft from an aircraft catapult using a linear motor drive rather than the conventional steam piston drive.

It uses computer-controlled, solid-state electronics to propel an armature down a track. The system’s main benefit is that it allows a more graded acceleration, reducing stress on the aircraft’s airframe. Other benefits include lower system weight, probable lower cost and reduced maintenance needs. Owing to its flexible architecture, the electromagnetic aircraft launch system can launch a wide variety of aircraft weights and can be used on platforms with differing configurations.
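
For a rough sense of the loads involved, the average acceleration over a catapult stroke is a = v²/2s. A sketch with ballpark launch-speed and stroke-length figures (illustrative assumptions, not EMALS specifications):

```python
# Average acceleration needed over a catapult stroke: a = v^2 / (2 * s).
# Ballpark illustrative figures, not EMALS specifications.
launch_speed = 75.0    # m/s, roughly a loaded jet's takeoff speed
stroke_length = 100.0  # m, roughly a carrier catapult's run

accel = launch_speed**2 / (2 * stroke_length)
g_load = accel / 9.81

print(f"average acceleration ≈ {accel:.0f} m/s² (≈ {g_load:.1f} g)")
```

A linear motor can shape its force along the entire stroke to hold close to this average, whereas a steam piston delivers its hardest shove at the start; that shaping is the graded acceleration described above.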

The design includes the capability to launch aircraft that are heavier or lighter than the conventional system can accommodate. The system also requires far less fresh water, reducing the need for energy-intensive desalination. The age of steam appears to have ended, at least as far as US aircraft carriers are concerned: at Newport News, Virginia, the USS Gerald R. Ford (CVN 78) successfully test-fired the revolutionary Electromagnetic Aircraft Launch System (EMALS).

Higher Launch Energy/More Reliable/Mechanically Modest

This replaces the steam catapults that have been standard carrier equipment since the 1950s. The test made a literal splash, as it involved an unmanned dead-weight sled rather than an aircraft, which landed about a hundred yards off the bow of the still-under-construction vessel. The Ford, the first of her class, will be the first US Navy ship to carry the electromagnetic launchers; though they will be used on all future carriers, they will not be retrofitted to existing vessels.

Under development for more than 25 years and manufactured by General Atomics, EMALS is the first new carrier launch technology in 60 years to progress toward real-world application. EMALS is designed to replace the steam-powered launch system that has long been standard on strike carriers.

According to the Navy, EMALS can be used by a wide range of aircraft, is near-silent, and delivers smoother acceleration with a much more consistent launch speed. It also provides higher launch energy, is more reliable and mechanically simpler, and is easier to maintain.

Substantial Advantages over Present Launch Systems

EMALS was tested in the first phase of ACT testing, which ended in 2011 and included 134 manned launches of aircraft including the F/A-18E Super Hornet, T-45C Goshawk, C-2A Greyhound, E-2D Advanced Hawkeye and F-35C Lightning II. The second phase saw launches of the EA-18G Growler and F/A-18C Hornet; overall, 452 manned launches were conducted. The substantial advantages EMALS provides over current launch systems include:

  • Reduced operating and life-cycle cost 
  • Reduced thermal signature 
  • Increased launch availability for manned and unmanned aircraft
  • Reduced topside weight 
  • Reduced installed volume

Saturday, April 9, 2016

Mathematicians Find a Peculiar Pattern In Primes


Mathematicians – Discovered Pattern in Primes

Mathematicians have discovered that prime numbers – numbers divisible only by 1 and themselves – seem to hate repeating themselves, preferring not to imitate the final digit of the preceding prime. Robert Lemke Oliver, a Stanford University postdoctoral researcher who discovered this unusual prime predilection with Stanford number theorist Kannan Soundararajan, commented that `it is really bizarre', and that they are `trying to understand what is at the heart of it'.

Primes, generally speaking, are considered to behave like random numbers, and whenever some kind of order is discovered, it gives mathematicians pause. Number theorist Barry Mazur of Harvard University stated that `any regularity one can show regarding primes is beguiling, since some new structure may be lurking there. Exposing some kind of architecture where we presumed there was none could lead to inroads into the structure of mathematics'.

Once primes get into the double digits, they must end in 1, 3, 7 or 9, and mathematicians know that there are roughly the same number of primes ending with each digit: each appears as the last digit about 25% of the time.

Bias in the Order in Which Final Digits Appear

The prime number theorem in arithmetic progressions proved this distribution about 100 years ago, and the still-unresolved Riemann hypothesis predicts that the rates approach 25% rapidly. The property has been tested for millions of primes, according to Soundararajan. With no reason to think otherwise, mathematicians had presumed that the distribution of those final digits was essentially random.

Given a prime ending in 1, the odds that the next prime ends in 1, 3, 7 or 9 should be approximately equal. Soundararajan commented that this is what one would expect if there were no interaction among primes, `though something funny happens'. Although each final digit appears about the same proportion of the time, there is a bias in the order in which these final digits appear. A prime ending in 7, for instance, is far less likely to be followed by another prime ending in 7 than by one ending in 9, 3 or 1.

Anti-Sameness Bias

Andrew Granville, a number theorist at the University of Montreal and University College London, stated that the discovery of the final-digit bias has no conceivable practical use; the point is the wonder of the discovery. The irregular pattern had been observed previously by two separate teams of researchers, but the Stanford duo appear to be the first to offer a mathematical explanation for it, posted online on March 11. Granville, who calls the work rigorous, refined and delicate, noted that when the researchers crunched the numbers, the predictions based on their hypotheses fitted the results uncannily well.

One might think this `anti-sameness' bias follows naturally from the ordering of numbers: 67 is followed by 71, which is followed by 73. However, this explanation does not fit the data, according to Lemke Oliver, who checked the computer calculations out to 400 billion primes. He says the bias is far too large for that, and is not equal across the non-repeating final digits.

Among the first hundred million primes, for instance, a prime ending in 3 is followed by a prime ending in 9 about 7.5 million times, while a final 3 is followed by another final 3 a mere 4.4 million times.
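
The bias is easy to reproduce at small scale. A sketch that sieves the primes below one million and tallies the final digits of consecutive pairs (a far smaller sample than the 400-billion-prime computations mentioned above, but the anti-repetition bias already shows):

```python
# Sieve the primes below one million and tally (last digit, next prime's
# last digit) pairs, exhibiting the anti-repetition bias at small scale.
from collections import Counter

def primes_below(limit):
    """Simple sieve of Eratosthenes returning all primes < limit."""
    sieve = bytearray([1]) * limit
    sieve[0:2] = b"\x00\x00"
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = bytearray(len(range(p * p, limit, p)))
    return [i for i in range(limit) if sieve[i]]

primes = primes_below(1_000_000)

# Skip 2 and 5, whose final digits occur only once.
pairs = Counter(
    (a % 10, b % 10) for a, b in zip(primes, primes[1:]) if a > 5
)

for d in (1, 3, 7, 9):
    row = ", ".join(f"->{e}: {pairs[(d, e)]}" for e in (1, 3, 7, 9))
    print(f"primes ending in {d}: {row}")
```

In every row, the repeated digit (1 followed by 1, 3 followed by 3, and so on) is the rarest successor, just as the researchers describe.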

Saturday, August 22, 2015

A Warp Drive


Warp Drive – Hypothetical FTL

Warp drive is a hypothetical faster-than-light (FTL) propulsion system that would allow travel at many times the speed of light by manipulating the fabric of space and time. It is what enabled the Starship Enterprise to boldly go where no man has gone before.

A spacecraft equipped with a warp drive could travel at apparent speeds greater than that of light by several orders of magnitude. Compared with other hypothetical FTL technologies such as jump drives or hyperdrives, a warp drive does not allow instantaneous travel between two points; a measurable amount of time still passes.

Spacecraft at warp velocity would still interact with objects in normal space. In 2012, a NASA research team revealed that it was working on a Star Trek-style warp drive, and the researchers already have several ideas for how warp drives might eventually be made to work.

The most promising possibility currently rests on a 1994 theory by Miguel Alcubierre, in which the fabric of space is `expanded' behind the ship and contracted ahead of it, so that the ship moves forward as if riding a huge conveyor belt.
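
For reference, the metric Alcubierre proposed in his 1994 paper can be written (in units with c = 1) as

```latex
ds^2 = -dt^2 + \bigl(dx - v_s(t)\, f(r_s)\, dt\bigr)^2 + dy^2 + dz^2
```

where v_s(t) is the velocity of the bubble's centre along x, r_s is the distance from that centre, and f(r_s) is a smooth function equal to 1 inside the bubble and 0 far outside, so that space contracts ahead of the bubble and expands behind it. This is the standard published form of the metric, not something derived in the article.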

Several Images of Interstellar Travel

Ever since the sound barrier was broken, many have turned their focus to breaking the light-speed barrier. However, warp drive – or any other form of faster-than-light travel – remains at the level of speculation. The bulk of scientific knowledge, grounded in Einstein’s theory of relativity, holds that it is impossible; there are some credible concepts in the scientific literature, but it is too early to know whether they are practical.

Science-fiction writers have conjured many images of interstellar travel, though travelling at the speed of light remains, for now, only imaginary. Meanwhile, science is making headway: while NASA is not pursuing interstellar flight, its scientists continue to advance ion propulsion for missions to deep space and beyond, using solar electric power. This method of propulsion is among the fastest and most efficient available to date.

System Depends on Electromagnetic Drive/EMDrive

NASA is said to have been quietly testing an innovative new method of space travel that would someday enable humans to travel at speeds much faster than light. Researchers believe the new drive could carry passengers and their equipment to the moon in as little as four hours.

A trip to Alpha Centauri, which would otherwise take tens of thousands of years, could then be made in just 100 years. The system relies on the electromagnetic drive, or EMDrive, which converts electrical energy into thrust without the use of any rocket fuel.

Over the years, several strange scientific theories have become reality. To learn more about theories of interstellar flight, one can visit the Tau Zero Foundation; Marc Millis, a former NASA Glenn physicist, founded the organization to pursue revolutionary advancements in propulsion. Warp drive, which would allow humans to travel to other galaxies, may no longer be purely the domain of science fiction.

Saturday, December 13, 2014

Tesla Death Ray

The Death Ray/Death Beam – Theoretical Particle Beam 

The death ray or death beam was a theoretical particle beam or electromagnetic weapon of the 1920s and 1930s, claimed to have been independently invented by Edwin R. Scott, Graichen, Guglielmo Marconi, Nikola Tesla and Harry Grindell Matthews, among several others.

The National Inventors Council was, as late as 1957, issuing lists of needed military inventions that included a death ray. Based on fiction and inspired by past speculation, research into energy-based weapons has contributed to real-life weapons in use by modern militaries, sometimes called a kind of "death ray", such as the United States Navy's Laser Weapon System (LaWS), deployed in mid-2014. These armaments are technically known as directed-energy weapons.

Tesla inherited a deep hatred of war from his father, and throughout his life he sought a technological way to put an end to warfare. He believed that war could be converted into "a mere spectacle of machines".

At a press conference in 1931 he told reporters that he was on the verge of discovering a new source of energy. Asked to explain the nature of the power, he said that the idea had first come to him as a tremendous shock, and that at that time he could only state that it would come "from an entirely new and unsuspected source".

Death Beam/Teleforce – Invented in 1930

Nikola Tesla claimed in the 1930s to have invented a "death beam", which he called teleforce. He explained: "This invention of mine does not contemplate the use of any so-called 'death rays'. Rays are not applicable because they cannot be produced in requisite quantities and diminish rapidly in intensity with distance.

All the energy of New York City transformed into rays and projected twenty miles could not kill a human being, because, according to a well-known law of physics, it would disperse to such an extent as to be ineffectual. My apparatus projects particles which may be relatively large or of microscopic dimensions, enabling us to convey to a small area at a great distance trillions of times more energy than is possible with rays of any kind.

Thus many thousands of horsepower can be transmitted by a stream thinner than a hair, so that nothing can resist." He proposed that a nation could destroy anything approaching within 200 miles, providing a wall of power that would make any country, big or small, impregnable against armies, airplanes or any other means of attack.

Particle Accelerator 

The mechanism of Tesla's death ray is not well understood; it was apparently a kind of particle accelerator. According to Tesla, it was an outgrowth of his magnifying transmitter, with its energy output focused into a beam so concentrated that it would not scatter even over great distances.

Tesla promoted the device purely as a defensive weapon, intended to knock down incoming attacks, making it a great-grandfather of the Strategic Defence Initiative. Though the death ray was never built, the concept fuelled science fiction stories, leading to the hand-held ray guns used by fictional heroes like Flash Gordon. Similar weapons are also found in George Lucas's science fiction saga Star Wars.

Wednesday, December 3, 2014

Bernoulli Space

Daniel Bernoulli
The Bernoulli Space – A Mathematical Concept 

The Bernoulli space is a mathematical model of the transition from past to future that takes into account the uncertainty of future developments. It is the central concept of Bernoulli stochastics, which provides the basis for reliable and accurate predictions and measurements.

The Bernoulli space does not presume an ideal world, such as the world of physics based on a belief in truth and causality. In contrast, the Bernoulli space acknowledges human ignorance and the cosmological patterns that generate uncertainty, and as a mathematical model of change it can be regarded as a means of obtaining accurate and reliable predictions.

Before introducing the model, the term "uncertainty" should be explained, since it has several inconsistent meanings in everyday speech. Uncertainty may refer to determinate but unknown facts, or to the indeterminate future, which is unknown because it does not yet exist. The difference between facts and future events is clear: a fact is fixed and, once specified, has a definite value, whereas a future event is indeterminate and may or may not occur.

Ignorance Related to True Value – Unknown 

A fact may be unknown, but it is fixed, while a future event is unknown because it is subject to randomness and may or may not occur. Under this interpretation, uncertainty is the inability to predict future developments accurately, an inability that arises from ignorance about the past and randomness regarding the future.

These two elements, ignorance and randomness, together generate uncertainty, and a quantitative description of uncertainty must take both sources into consideration. Ignorance relates to the true value being unknown: only rarely are the true values of the considered attributes known; in general they are unknown.

What is known are values that are not the true value, and from the knowledge of "what is not", it is possible to state a set of values that contains the true but unknown value. This set reflects the knowledge of "what is not", or equivalently the ignorance of "what is": the larger the set, the larger the ignorance.

Randomness – Future Episodes

Randomness, on the other hand, refers to future events and reflects a natural tendency of an event to occur, which may be small or large depending on the situation. More than 300 years ago, Jacob Bernoulli, the Swiss theologian and mathematician, succeeded in quantifying the randomness of a future event.

He related the natural tendency of a future event to occur to the degree of certainty of its occurrence, calling this degree probability. If the occurrence of an event is impossible, there is no propensity, and the degree of certainty, the probability, is 0.

If, on the other hand, the occurrence is certain, the propensity is at its maximum, with the degree of certainty taking the value 1.
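Bernoulli's degree of certainty can be illustrated by simulation: the relative frequency of an event approaches its probability as the number of trials grows, with impossible events (probability 0) never occurring and certain events (probability 1) always occurring. The probability value 0.3 below is an arbitrary example, not from the text.

```python
import random

# Simulate repeated Bernoulli trials and observe the relative frequency
# of the event converging to its probability (the law of large numbers,
# which Jacob Bernoulli proved).

def relative_frequency(p, trials):
    """Run `trials` Bernoulli experiments with success probability p
    and return the observed fraction of successes."""
    successes = sum(random.random() < p for _ in range(trials))
    return successes / trials

random.seed(42)
for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(0.3, n))

# The two extremes: an impossible event never occurs, a certain one always does.
print(relative_frequency(0.0, 1000), relative_frequency(1.0, 1000))  # 0.0 1.0
```

The printed frequencies drift closer to 0.3 as the number of trials increases, which is exactly the sense in which probability quantifies the propensity of a future event.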

Saturday, November 29, 2014

Minimax Decision Theory

Minimax – Decision Rule 

Minimax is a decision rule used in decision theory, statistics, game theory and philosophy to reduce the possible loss in a worst-case scenario. It is a principle for decision making when presented with two different and conflicting strategies: using logic, one determines the strategy that minimizes the maximum loss that may occur.

Originally formulated for two-player zero-sum game theory, covering both the case where players make alternate moves and the case where they move simultaneously, it has been extended to more complex games and to general decision making in the presence of uncertainty. In the theory of simultaneous games, a minimax strategy is a mixed strategy that is part of the solution to a zero-sum game, and in zero-sum games the minimax solution is the same as the Nash equilibrium.

There is a minimax algorithm for game solutions in combinatorial game theory. For games like tic-tac-toe, where each player can win, lose, or draw, a simple version of the minimax algorithm is used. The minimax algorithm is a recursive algorithm for choosing the next move in a game, usually one played by two players.

Combinatorial Game Theory

A value is associated with each position or state of the game, computed by a position evaluation function that indicates how good it would be for a player to reach that position. The player then makes the move that maximizes the minimum value of the positions resulting from the opponent's possible following moves.

One allocation method assigns a certain win for one player as +1 and for the other player as -1, which leads to the combinatorial game theory developed by John Horton Conway. An alternative is the rule that if the result of a move is an immediate win for one player, it is assigned positive infinity, and if it is an immediate win for the other player, negative infinity. The value of any other move for the first player is the minimum of the values resulting from each of the second player's possible moves. For this reason the first player is called the maximizing player and the second the minimizing player; hence the name minimax algorithm.
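The +1/-1 scoring scheme described above can be sketched directly for tic-tac-toe. This is a minimal illustrative implementation, not taken from any particular source; the board is a flat list of nine cells.

```python
# A minimal minimax sketch for tic-tac-toe, using the scoring convention
# described above: +1 for an X win, -1 for an O win, 0 for a draw.
# Board: list of 9 cells holding 'X', 'O', or None.

WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
             (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
             (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    for a, b, c in WIN_LINES:
        if board[a] is not None and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player):
    """Return the game value of `board` with `player` to move.
    'X' is the maximizing player (+1), 'O' the minimizing player (-1)."""
    w = winner(board)
    if w == 'X':
        return 1
    if w == 'O':
        return -1
    if all(cell is not None for cell in board):
        return 0  # draw
    values = []
    for i in (i for i, cell in enumerate(board) if cell is None):
        board[i] = player
        values.append(minimax(board, 'O' if player == 'X' else 'X'))
        board[i] = None  # undo the move before trying the next one
    return max(values) if player == 'X' else min(values)

# With perfect play from both sides, tic-tac-toe is a draw:
print(minimax([None] * 9, 'X'))  # 0
```

Running the search from the empty board returns 0, confirming the well-known result that tic-tac-toe is a draw under perfect play.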

Heuristic Evaluation Function 

This algorithm will assign a value of positive or negative infinity to any position, since the value of every position will be the value of some final winning or losing position. But that is generally only possible at the very end of a complicated game such as chess, because it is not computationally feasible to look ahead as far as the completion of the game.

We can extend this by supplying a heuristic evaluation function that gives values to non-final game states without considering all possible complete sequences of play, and then limiting the minimax algorithm to look only a certain number of moves ahead. This number is known as the "look-ahead", measured in "plies". For instance, the chess computer Deep Blue looked ahead at least 12 plies and then applied a heuristic evaluation function.
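The look-ahead idea can be sketched as a depth-limited minimax. The game tree below is a toy example (nested lists whose leaves are heuristic scores), not a real chess position, and the `evaluate` function is a hypothetical stand-in for a real heuristic.

```python
# A depth-limited minimax sketch, following the 'look-ahead' scheme above:
# the search stops after a fixed number of plies, at which point a heuristic
# evaluation function is applied instead of searching to the end of the game.

def evaluate(node):
    """Heuristic evaluation: in this toy tree a leaf is already a score;
    an unexplored subtree is scored 0 (a hypothetical stand-in)."""
    return node if not isinstance(node, list) else 0

def minimax(node, depth, maximizing):
    # Leaf reached, or the look-ahead limit exhausted: apply the heuristic.
    if depth == 0 or not isinstance(node, list):
        return evaluate(node)
    if maximizing:
        return max(minimax(child, depth - 1, False) for child in node)
    return min(minimax(child, depth - 1, True) for child in node)

tree = [[3, 5], [2, [9, -1]]]
print(minimax(tree, 2, True))   # a two-ply look-ahead from the root
```

With a two-ply look-ahead, the subtree `[9, -1]` is never expanded; it is scored by the heuristic, which is exactly the trade-off a limited look-ahead makes.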

Tuesday, November 25, 2014

Ball Lightning

Ball Lightning
Ball Lightning – Rare Weather Phenomenon

A rare weather phenomenon known as ball lightning has been observed for the first time in nature by Chinese scientists. Ball lightning is an unexplained electrical phenomenon that occurs during thunderstorms; it is unpredictable, which is why researchers know little about it. It tends to last for more than a second, which is long-lived for lightning, but it is hard to capture and study.

Some reports state that the ball sometimes explodes, occasionally with fatal consequences, leaving behind the odour of sulphur. Several scientific hypotheses about ball lightning have been proposed down the centuries. In a report published in The Journal of Physical Chemistry last year, researchers at the U.S. Air Force Academy in Colorado figured out how to reproduce ball lightning in the lab: electrodes partially submerged in an electrolyte solution were used to create high-power electric sparks, which resulted in bright white plasmoid balls.

In layman's terms, ball lightning is a big flash of light, circular in appearance, seen in the sky during a storm. Sometimes it has a blue glow and emanates from objects like lightning rods or ship masts.

Ball Lightning 1
Few Scientific Data – Infrequency/Unpredictability

Until the 1960s, several scientists argued that ball lightning was not a real phenomenon, in spite of numerous sightings throughout the world, and laboratory experiments could produce effects visually similar to reports of ball lightning.

Many speculate whether these effects are related to the natural phenomenon. Scientific data on ball lightning are few, owing to its infrequency and unpredictability; its existence is presumed on the basis of reported public sightings, which have produced inconsistent findings. Given these inconsistencies and the lack of reliable data, the true nature of ball lightning is still unknown.

The first optical spectrum of what appears to be a ball lightning event was published in January 2014 and included a high-frame-rate video. Historical accounts suggest that ball lightning may be the source of legends describing luminous balls, such as the Anchimayen of Mapuche mythology. Some claim that it is small and harmless, while others state that it is large and dangerous.

Ball Lightning 1
Mike Lindsay/US Air Force Academy
Various Theories

A study done in 1960 suggested that 5% of the Earth's population had witnessed ball lightning, while another study analysed reports of around 10,000 cases. One early description dates from 21 October 1638, during the Great Thunderstorm at a church in Widecombe-in-the-Moor, Devon, England: four people died and around 60 were injured during a severe storm when an 8-foot ball of fire struck the church, entered it and wrecked it.

Huge stones from the church walls were hurled to the ground and through the large wooden beams, and the ball of fire smashed the pews and several windows, filling the church with the odour of sulphur and dark, thick smoke. It was reported that the ball of fire divided into two parts, one exiting by smashing a window open, the other disappearing somewhere inside the church.

The conclusion at the time was that the ball of fire was "the devil" or "the flames of hell", the fire having the smell of sulphur. Others blamed it on two people who had been playing cards in the pew during the service, arousing God's wrath.

Thursday, June 12, 2014

The Monty Hall Problem – A Brain Teaser

The Monty Hall Problem
The Monty Hall Problem, a brain teaser, got its name from the TV game show "Let's Make a Deal", hosted by Monty Hall, and has fascinated mathematicians with the possibilities presented by the "three doors".

This has led to the mathematical urban legend surrounding the "Monty Hall Problem". The problem was first posed in a letter by Steve Selvin to the American Statistician in 1975, and it became popular through a reader's letter quoted in Marilyn vos Savant's "Ask Marilyn" column in Parade magazine in 1990.

The scenario is this: the contestant is given the choice of one of three closed doors, one hiding a car as the prize while the other two conceal goats. The contestant does not know where the car is, but Monty Hall does. The contestant selects a door, and Monty opens one of the remaining doors, one he knows does not have the car behind it.

If the contestant has already chosen the correct door, Monty is equally likely to open either of the two remaining doors, revealing that it contains no prize. He then asks the player whether they would like to switch to the other unopened door or stay with their original selection. The player is at a crossroads: what is the probability of winning the car by sticking with the original choice, and what are the chances if they switch?

The Monty Hall Problem Difficult to Grasp

The Monty Hall Problem 1
The Monty Hall problem is difficult to grasp, and unless the player thinks carefully about the deal, the role of the host goes unappreciated. The problem has drawn much academic interest because of its surprising result and simple formulation.

Variations have also been made by changing the implied assumptions, creating drastically different consequences. In one variation, if Monty offers the contestant the chance to switch only when the contestant has initially chosen the door with the car behind it, then the contestant should never switch. In another, if the other door is opened at random and happens to reveal a goat, then switching makes no difference to the contestant's chances.

The problem is a veridical paradox: the correct result is counterintuitive and seems absurd, but is nevertheless true. It is closely related to the earlier Three Prisoners problem and to the older Bertrand's box paradox.

Switching can turn Loss into Win/Win into Loss

According to Jason Rosenhouse, a James Madison University mathematics professor who has written an entire book on the subject (The Monty Hall Problem: The Remarkable Story of Math's Most Contentious Brainteaser), the contestant doubles their chance of winning by switching doors when three conditions are fulfilled.

First, Monty never opens the door the contestant selected; second, he always opens a door concealing a goat; and third, when the first two rules leave Monty with a choice of doors to open, he makes his choice at random. Under these rules, switching turns a loss into a win and a win into a loss, according to Rosenhouse.
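The three conditions above can be checked by simulation. The sketch below plays the game many times under exactly those rules and compares the win rate for staying versus switching; the trial count is an arbitrary choice.

```python
import random

# A Monte Carlo check of the Monty Hall game under the three rules above:
# Monty never opens the contestant's door, always reveals a goat, and
# chooses at random when both remaining doors hide goats.

def play(switch, trials=100_000):
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)    # the door hiding the car
        pick = random.randrange(3)   # the contestant's first choice
        # Monty opens a door that is neither the pick nor the car.
        opened = random.choice([d for d in range(3)
                                if d != pick and d != car])
        if switch:
            # Switch to the one remaining unopened door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print(f"stay:   {play(switch=False):.3f}")   # close to 1/3
print(f"switch: {play(switch=True):.3f}")    # close to 2/3
```

The simulated win rates come out near 1/3 for staying and 2/3 for switching, matching Rosenhouse's claim that switching doubles the chance of winning.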

Saturday, June 7, 2014

Gravitational Waves – Ripples in Space-time Continuum

Gravitational Waves – Ripples in Space-time Continuum

Gravitational waves are ripples in the space-time continuum, foreseen by Albert Einstein in his general theory of relativity. These ripples carry energy across the universe and were predicted to exist as a consequence of Einstein's general theory of relativity in 1916.

Though there is strong evidence of their existence, gravitational waves have not been directly detected, since their effect is minuscule, a million times smaller than an atom.

Ripples in Space-time Continuum

They are like tiny waves on a faraway lake: from a distance the lake's surface looks glassy smooth, and the details of the surface can be seen only up close. In 1916, Einstein discovered a mathematical way to explain gravity and called it his general theory of relativity.

This theory relied on a set of coordinates which described the combination of space and time known as the space time continuum.

Warping of Matter and Energy – Force of Gravity

Matter and energy warp the space-time continuum like a heavy weight on a mattress, and this warping creates the force of gravity; gravitational waves are ripples in that space-time continuum.

General relativity shows how gravity affects time, something satnav must take into account to tell you where you are. A telescope at the South Pole known as BICEP (Background Imaging of Cosmic Extragalactic Polarisation) has been looking for evidence of gravitational waves by detecting a subtle property of the cosmic microwave background radiation.

The radiation, produced in the Big Bang, was originally discovered by American scientists in 1964 with a radio telescope and has been called the "echo" of the Big Bang. BICEP measured the large-scale polarisation of this microwave background. Only primordial gravitational waves, amplified by inflation, can imprint such a pattern.

Curvature of Space-time 

Curvature of Space- Time
According to Einstein's theory of general relativity, gravity is treated as a phenomenon resulting from the curvature of space-time, and this curvature is caused by the presence of mass. The more mass contained within a given volume of space, the greater the curvature of space-time at the boundary of that volume.

When the objects with mass move around in space-time, the curvature also changes to reflect the changed location of these objects.

In some situations, accelerating objects generate changes in this curvature that propagate outwards at the speed of light in a wave-like manner; these propagating changes are known as gravitational waves.

Effects of Passing Gravitational Wave

Effects of Passing Gravitational Wave
As a gravitational wave passes a distant observer, the observer finds space-time distorted by the effects of strain: distances between free objects increase and decrease rhythmically as the wave passes, at the frequency of the wave, even though the free objects are not subjected to any unbalanced force.

The magnitude of the effect decreases inversely with distance from the source. The effect of a passing gravitational wave can be visualised by imagining a flat region of space-time with a group of motionless test particles lying in a plane; as the wave passes through the particles along a line perpendicular to their plane, the particles follow the distortion in space-time.
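The strain effect described above can be made concrete with a back-of-the-envelope calculation. The change in distance is simply the strain times the distance; the strain amplitude and arm length below are illustrative assumptions, roughly typical of LIGO-class detectors, not figures from this article.

```python
# A back-of-the-envelope sketch of how small the strain effect is.
# Strain h is dimensionless: a distance L stretches or shrinks by h * L.
# The values of h and L here are illustrative assumptions.

h = 1e-21          # strain amplitude of a passing gravitational wave
L = 4_000.0        # detector arm length in metres (LIGO uses 4 km arms)

delta_L = h * L    # change in arm length caused by the wave
print(f"arm length change: {delta_L:.1e} m")  # 4.0e-18 m

proton_diameter = 1.7e-15  # metres, approximate
print(f"fraction of a proton diameter: {delta_L / proton_diameter:.4f}")
```

The displacement comes out thousands of times smaller than a proton, which is why the article calls the effect "a million times smaller than an atom" and why direct detection is so hard.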

Tuesday, April 22, 2014

Heat Death Of Universe

Heat Death Of  Universe
Living among Glowing Filaments of Energy

The universe differs in composition from place to place. We live among glowing filaments of energy and matter collected against a background of nothingness; the planetary systems, stars, nebulae, black holes and galaxies are all concentrated specks in space.

In the distant future these hot, heavy specks may spread out into the cold void, mixing until everything becomes a thin mist. Like boiling water added to cold water, the two extremes will balance out, leaving everything lukewarm.

The idea of heat death began with the work of several physicists who studied how machines transform heat into mechanical work, forming an empirical conception of how steam engines and other producers of force do this. This led them to the understanding that an entire system settles down to an intermediate temperature at which no more energy transfer takes place.

This is known as the maximization of entropy. Atoms, molecules and subatomic particles collide in space, spreading energy and momentum from fast to slow; the motion of the particles of the universe gradually becomes random, until they collide and interact with no net change of energy, dispersing into empty space.
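The "boiling water added to cold water" analogy used above can be made quantitative with conservation of energy: when two bodies of the same substance mix, the final temperature is the mass-weighted average of their temperatures. The masses and temperatures below are illustrative, not from the text.

```python
# The 'lukewarm' equilibrium of the mixing analogy, made quantitative.
# For two amounts of the same substance (same specific heat c), energy
# conservation m1*c*(t1 - T) = m2*c*(T - t2) gives a mass-weighted average.

def equilibrium_temperature(m1, t1, m2, t2):
    """Final temperature when two bodies of the same substance mix."""
    return (m1 * t1 + m2 * t2) / (m1 + m2)

# 1 kg of boiling water poured into 1 kg of cold water:
print(equilibrium_temperature(1.0, 100.0, 1.0, 10.0))  # 55.0
```

Once the mixture sits at this single intermediate temperature, no further heat flows, which is the small-scale version of the maximum-entropy state the paragraph describes.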

Universe Reaches Maximum Entropy

The heat death of the universe will take place when the universe reaches a state of maximum entropy, when all available energy has moved from hot sources to colder ones. When this happens, no further work can be extracted from the universe.

Since heat will cease to flow, no more work can be obtained from heat transfer, and the same equilibrium is expected for other forms of energy, mechanical, electrical and so on, leaving the universe effectively dead, especially for the use of humankind. The idea of heat death originates from the second law of thermodynamics: entropy increases in an isolated system, and the universe is such a system.

Entropy, which measures the number of ways in which a system can be arranged, cannot decrease; a system evolves towards maximum disorder, or thermodynamic equilibrium. When this happens, energy will be divided equally throughout the cosmos, leaving no reusable heat or energy anywhere, and processes that consume energy, including those on our planet Earth, will cease to exist.

Big Freeze/Big Chill

The heat death, also known as the Big Freeze or the Big Chill, is one theory of how the universe will end, suggested as one of the ways the cosmos could come to an end, given that it is expanding.

The heat death is considered a possible ultimate fate of the universe, in which the universe diminishes to a state of no thermodynamic free energy and can no longer sustain processes that consume energy. It does not imply any particular temperature, only that the temperature differences and other gradients needed to perform work no longer exist.

Sunday, April 13, 2014

Wow Signal – Brief Burst of Radio Waves

Wow Signal
The Search for Extraterrestrial Intelligence (SETI) has seen many astronomers scanning the sky in hopes of receiving artificially generated radio signals sent by alien civilizations. The "Wow!" signal was a brief burst of radio waves, a strong narrowband radio signal, detected by astronomer Jerry Ehman on August 15, 1977, while he was working on a SETI project at Ohio State University's Big Ear radio telescope, then located at Ohio Wesleyan University's Perkins Observatory in Delaware, Ohio.

The signal was so remarkable that Ehman circled it on the computer printout and wrote "Wow!" in the margin, unintentionally giving the signal the name by which it became famous. The signal bore the hallmarks of a non-terrestrial, non-solar-system origin; it lasted for the full seventy-two-second window during which Big Ear was able to observe it, but has not been detected since. It has been the subject of significant media attention ever since.

No Identification for Signal’s Source

No source for the signal has been identified, in spite of great efforts, and no repeat signal has been found, leaving a mystery. The only conclusion drawn was that if the signal originated in deep space, it was either an astrophysical phenomenon never seen before or an intercepted alien signal.

After trying and failing to find any repeat of the signal, the Wow! signal researchers faced difficulty, and Ehman became skeptical of its origin, stating that "something suggests it was an Earth-sourced signal that simply got reflected off a piece of space debris". When he attempted to investigate that explanation, he faced more problems.

Signal’s Intensity

Wow Signal  Location
On investigation, it seemed the signal could have originated on Earth, perhaps reflected off a piece of space junk. But given how specific the received signal was, such explanations required too many assumptions, and the reasoning pattern known as Occam's razor pointed towards an astrophysical origin for the signal, though it did not provide an explanation.

The signal's intensity was observed to rise and fall over a period of 72 seconds, consistent with the Earth's rotation sweeping the Big Ear telescope past a single fixed source in the sky. This gave the signal the characteristic signature of an object in the sky, one that would be nearly impossible for any Earth-bound object to match.

Amazing Frequency – Sharp Transmission

The signal also stood out over the background noise of deep space, about 30 times louder than anything else, and the most remarkable thing about it was its frequency: it was sharp, transmitting at only a single frequency.

Natural radio sources do not behave that way: they spread across a range of frequencies, the same signal covering a broad band of transmission. The Wow! signal was far from that, showing only one specific frequency of approximately 1420 MHz. This is the hydrogen line, a frequency internationally protected from terrestrial radio transmission because of its use in radio astronomy.
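The 1420 MHz hydrogen line mentioned above corresponds to the famous 21 cm line of neutral hydrogen; converting frequency to wavelength is a one-line check.

```python
# Converting the hydrogen line frequency to its wavelength: lambda = c / f.

c = 299_792_458          # speed of light, m/s
f = 1420.405e6           # hydrogen line frequency, Hz

wavelength = c / f
print(f"{wavelength * 100:.1f} cm")  # 21.1 cm
```

This is why the hydrogen line is also called the 21 cm line, and why SETI searches have long regarded frequencies near it as a natural "meeting point" for interstellar signalling.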

Thursday, April 10, 2014

Angel Hair – Connected to UFO

Angel Hair
Angel hair, or siliceous cotton, is a fibrous, sticky substance reported in connection with UFO sightings. It is named for its similarity to fine hair or spider webs, and in some instances it has been found to be the web threads of migrating spiders. Angel hair is reported to evaporate or disintegrate within a short time of forming.

It is an important aspect of Raelism, a UFO religion, and one theory is that it is created from ionized air sleeting off an electromagnetic field surrounding a UFO. This cobweb-like, jelly-like, slightly radioactive substance, often said to fall to the ground after UFO sightings, has been dubbed "angel hair"; held in the hand, the cottony tufts have an offensive smell.

American ufologists call it angel hair, Italians call it siliceous cotton, and the French use the term "the Madonna's present" to describe the semitransparent threads that fall from the heavens.

Phenomenon of Shining Flying Spindles started in 1954

Angel Hair 1
Discussion of the phenomenon began in 1954, when two men, Gennaro Lucetti and Pietro Lastrucci, standing on a hotel balcony near St. Mark's Square in Venice on October 27, 1954, suddenly saw two shining spindles flying across the sky.

The objects left a fiery white trail as they moved, flying at high speed some distance apart before making a U-turn and flying away in the direction of Florence. It was also reported that a soccer game in one of Florence's stadiums was unexpectedly interrupted: players, referees and some 10,000 spectators stood gazing at the two objects making their rounds over the stadium. The unidentified objects flew over the city three times between 14:20 and 14:29, and a number of strange cobweb-like threads dropped into the arena as the objects disappeared.

Fibrous Material Highly Resistant to Tension 

Since the substance disintegrated when held in the hand, a student named Alfredo Jacopozzi was the only one to pick up a few threads, seal them in a hermetic test tube, and hand the tube over to Professor Giovanni Canneri, director of the Chemical Analysis Institute at the University of Florence. His colleague, Professor Danilo Cozzi, carried out a series of tests on the mysterious substance and concluded that it was a fibrous material highly resistant to tension and torsion; subjected to heat, it grew dark and evaporated, leaving transparent sediments that melted away. These sediments contained boron, silicon and magnesium, and hypothetically, according to Professor Cozzi, the material could be some kind of boron-silicon glass.

Comprehensive Analysis Conducted

Charles Maney, an American ufologist, suggested that the material was excess UFO energy that had materialized; according to him, the threads returned to their own dimension or some other space-time continuum as they faded away.

Another, British, ufologist suggested that angel hair was a variety of ectoplasm emanating during a spiritualistic session. A Soviet researcher, B. V. Lyapunov, who had contributed much to popularizing science, received a sample of angel hair from New Zealand in 1967, in a tightly sealed tube containing an unknown substance measuring less than one tenth of a cubic centimetre. A comprehensive analysis of the substance was then conducted by a team of scientists.

The physicist L. V. Kirichenko, a specialist in radiometry, found that the substance was a fine-fibered material: some fibers were less than 0.1 micron in diameter, and most were tangled into bundles or separate threads about 20 microns in diameter, whitish and semitransparent. No analogues of the analysed substance were known. Academician I. V. Petryanov-Sokolov, summing up the study, stated that the sample was of considerable interest as a material with extremely fine fibers, and that it was unlikely to have been formed by nature.

Flying Web Type Substance

Unfortunately, the entire sample was used up during the research, and no fresh samples of angel hair were obtained despite repeated reports in the country. The British Society for UFO Studies reported in August 1998 that mysterious cobwebs fell to the ground shortly after a UFO sighting in North Wales: a sixty-year-old woman and her daughter-in-law sighted about 20 silver balls in the sky before noticing cobweb-like material falling to the ground.

In 1898, residents of the city of Montgomery in the United States reported angel hair falling from a clear blue sky, describing the fall as a "flying web type substance"; according to eyewitnesses, the threads resembled fluorescent asbestos fibers. A large number of sticky fibers were seen falling from the sky for two hours on February 10, 1978, near the coastal city of Samaru, New Zealand; the fibers appeared finer than cobwebs but were clearly visible against the clear blue sky.

Monday, March 31, 2014

Galactic Center Alignment Of Milky Way

Galactic Alignment between the Sun and the Center of Galaxy

Galactic Center Alignment Of Milky Way 3
Our planet is part of our solar system, our solar system is part of our galaxy, and our galaxy is part of the universe, a living and breathing organism of life. The Earth orbits the Sun, which is itself part of the Milky Way galaxy; the Sun takes about 220 million years to complete a single journey around the Milky Way. The Sun also moves up and down as it travels in its orbit around the center of the galaxy, an oscillation that takes a total of 64 million years to complete. When the Sun passes directly through the galactic disk, there is a perfect galactic alignment between the Sun and the center of the galaxy. The Milky Way is about 100,000 light years across and 1,000 light years thick, and during that 64-million-year cycle the Sun rises about 500 light years above the galactic plane, moves down through the plane until it is about 500 light years below it, and then returns again.

Galactic Equator – Milky Way

Galactic Center Alignment Of Milky Way 1
The Galactic Alignment is the alignment of the December solstice sun with the galactic equator of our galaxy, the Milky Way. It occurs as a result of the precession of the equinoxes and is associated with the end of the ancient Mayan calendar in 2012. Precession is the result of the earth wobbling very slowly on its axis, shifting the position of the equinoxes and solstices by about one degree every 71.5 years; since the sun's disk is about half a degree wide, it takes the December solstice sun roughly 36 years to precess across the galactic equator. The galactic equator, the line of 0 degrees galactic latitude, acts as a dividing line between the northern and southern halves of the Milky Way galaxy, and the precise alignment of the solstice point with the galactic equator was calculated to occur in 1998. This date was refined further by Smelyakov to May 7, 1998.
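The precession arithmetic is easy to check. The snippet below uses only the figures quoted in the paragraph (one degree per 71.5 years, a half-degree-wide solar disk), treating them as round approximations:

```python
# Back-of-envelope check of the precession figures quoted above.
YEARS_PER_DEGREE = 71.5   # precession shifts the solstice point ~1 degree per 71.5 years
SUN_WIDTH_DEG = 0.5       # apparent width of the Sun's disk on the sky

# Time for the solstice sun to precess across the galactic equator:
years_to_cross = SUN_WIDTH_DEG * YEARS_PER_DEGREE
print(f"Years to precess across the equator: {years_to_cross:.0f}")  # about 36

# Full precession cycle implied by the same rate:
full_cycle = 360 * YEARS_PER_DEGREE
print(f"Full precession cycle: {full_cycle:.0f} years")  # 25740, near the ~25,772-year figure
```

The half-crossing of about 18 years on either side of 1998 is what produces the 1980-2016 alignment "zone" discussed next.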

Galactic Alignment once every 25,000 years 

Galactic Center Alignment Of Milky Way 2
Because the half-degree width of the Sun makes the visual alignment imprecise, the Galactic Alignment `zone' is 1998 +/- 18 years, i.e. 1980 - 2016, which Jenkins calls "era 2012". The Galactic Alignment takes place only about once every 25,000 years, and it is what, some claim, the ancient Maya pointed to with the 2012 end date of their Long Count calendar. Those who claimed that the Maya predicted the end of the world in 2012 used this astronomical alignment to argue that it would take place at the end of the 13th Bak'tun of the Maya Long Count calendar, December 21st, 2012. The claim was that on that date the Sun would align with the center of the Milky Way galaxy, something said to happen only once every 25,772 years. It was further claimed that the Maya knew about the alignment and set their Long Count calendar to end on that day because the alignment would lead to something, left to the imagination, that would affect mankind on every level and dimension as well as every planet in our solar system.

Wednesday, February 5, 2014

Science Mystery: WORMHOLE, A Means To Travel Back In Time

Science fiction is filled with tales of traveling through wormholes, but the reality of such travel is more complicated, and we have yet to identify one. Wormholes, theoretical passages through space-time that would create shortcuts for long journeys across the universe, are predicted by the theory of general relativity, though they would bring hazards: intense radiation, sudden collapse, and dangerous contact with exotic matter. A wormhole consists of two spherical openings with a throat connecting the two.

The throat could be a straight stretch, or it could wind around, taking a longer route than a conventional path. In 1935, Albert Einstein and Nathan Rosen used the theory of general relativity to propose the existence of bridges through space-time; such a wormhole is known as an Einstein-Rosen bridge, a hypothetical topological feature of space-time that provides a shortcut through it. These paths, known as wormholes or Einstein-Rosen bridges, connect two different points in space-time, creating shortcuts that can reduce both distance and time.

A simple way to visualize a wormhole is to picture space as a two-dimensional surface with a tube connecting two points of that surface; the mouths of the wormhole are analogous to the holes at either end of the tube.

WormHole -2
No observational evidence has been found for wormholes, though the equations of general relativity have valid solutions that contain them. Because of this theoretical footing, the wormhole is one of the great metaphors of general relativity. The first kind of wormhole discovered was the Schwarzschild wormhole, present in the Schwarzschild metric that describes an eternal black hole, though it was shown that this type of wormhole collapses too quickly for anything to cross from one end to the other. Wormholes that could be crossed in both directions, known as traversable wormholes, would be possible only if exotic matter with negative energy density were used to stabilize them.

If a wormhole contained adequate exotic matter, whether naturally occurring or artificially added, it could in principle be used to send travelers or information through space. Adding exotic matter might stabilize a wormhole to the point where travelers could pass through it, yet the addition of ordinary matter might be enough to destabilize the portal again. Besides connecting two separate regions within our universe, wormholes could also connect two different universes, and some scientists presume that if one mouth of a wormhole were moved in a specific manner, it could be used for time travel, though British cosmologist Stephen Hawking argued that this would not be possible.
Present-day technology is insufficient to stabilize or enlarge wormholes even if one were ever found, though scientists continue to debate whether the concept could one day yield a workable method of space travel.

Without directly observing a wormhole, scientists can only speculate about how one might operate. Some suggest that a black hole, a point in space-time where gravity is so strong that not even light can escape, could serve as the entrance, connected by a tube to a white hole, from which matter and light would exit. The chances are slim that this type of wormhole could be used for space travel: such passages are likely to exist only on subatomic scales, and if one were large enough to traverse, gravitational forces would cause it to collapse almost instantly upon opening.

WormHole -3
Scientists believe that wormholes made of hypothetical exotic matter could contain the negative energy needed to stay stable, but that the addition of an outside object, such as a spaceship, would cause the tunnel to fall apart. An Einstein-Rosen wormhole would not be useful for travel since it collapses quickly, though recent research indicates that a wormhole containing exotic matter could stay open, unchanged, for a long period of time.

Einstein's theory of general relativity mathematically allows for the presence of wormholes, though none have been discovered to date. An apparent candidate might even be mistaken for a black hole, whose gravity is so intense that even light cannot escape. Certain solutions of general relativity admit wormholes whose mouth is a black hole, though a black hole formed naturally in the collapse of a dying star does not by itself create a wormhole.

Since such an object cannot be seen directly, NASA scientists instead focused on the tiny core of the galaxy M87, a supermassive cosmic engine 50 million light years from the earth. Astronomers found that the core of M87 contains a ferocious swirling maelstrom of superhot hydrogen gas spinning at 1.2 million miles per hour. To keep this disk of spinning gas from flying violently apart, a colossal mass concentrated at the center, weighing around 2 to 3 billion suns, is required. In a July publication in an American Institute of Aeronautics and Astronautics journal, Davis refers to time machines and the possibility that a wormhole could become, or be used as, a means of traveling back in time.

He further relates that scientists' understanding of the laws of physics admits numerous space-time geometries that portray time travel or have the properties of time machines. A wormhole would permit a ship to travel from one point to another much faster than a beam of light, not by exceeding light speed but by taking a shortcut through space-time: the ship arrives before the light does, yet never actually breaks the universal speed limit. A wormhole could thus be used to cut through space as well as through time.

Sunday, January 5, 2014

Science Mystery: Insight on the Butterfly Effect

Butterfly Effect-1

The term `butterfly effect' was coined by the American meteorologist Edward Lorenz to highlight the possibility of small changes causing momentous effects; in simple terms, it is used in chaos theory to describe how small changes to seemingly unrelated things or conditions can affect large complex systems. The term derives from the suggestion that the flapping of a butterfly's wings in South America could affect the weather in Texas, showing that the tiniest influence on one part of a system can have an impact on another part. The butterfly effect occurs when small events have exceedingly far-reaching, large impacts. The butterfly is apt because its fragile wings stir very little air as they flap, yet that minute movement can initiate a series of changes that grow until they cause a large storm thousands of miles away. It implies that large events may be connected to small, even minuscule, occurrences. In other words, the butterfly effect describes the idea that unless all factors are accounted for, major systems like the weather become impossible to predict with accuracy because of the many unknown variables.

Butterfly Effect-2
The concept is attributed to Edward N. Lorenz, a meteorologist and mathematician who was one of the first proponents of chaos theory. He studied the concept mathematically and drew the attention of other meteorologists to it, seeing meaningful patterns in what appeared to be random events in the weather; these ideas contributed to the new science of chaos. One day, while running a global climate model on his computer, he restarted one run from the middle rather than from the beginning, hoping to save time. The two weather predictions, one based on the whole process including the initial conditions and the other based on data from a run already half way through, diverged drastically. Lorenz had expected the models to be identical regardless of where they started, but realized that tiny, seemingly negligible differences between the two runs grew unpredictably. The effect was initially used to explain why weather forecasts were often inaccurate: initial conditions sometimes went unnoticed or were overlooked, and these minute conditions could be the cause of hurricanes or similar changes in the weather.
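Lorenz's accidental experiment is easy to reproduce today. The sketch below integrates his famous three-variable convection model twice, from two starting points differing by one part in a million (mimicking the rounded values of his restarted run), and records how far apart the two trajectories drift. It uses the standard Lorenz parameters and a simple fixed-step Euler integration for brevity; the exact separation you see depends on the step size, but the explosive growth does not.

```python
# Lorenz's three-variable model with the classic parameters
# sigma = 10, rho = 28, beta = 8/3, stepped with a crude Euler scheme.
def lorenz_step(state, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return (x + sigma * (y - x) * dt,
            y + (x * (rho - z) - y) * dt,
            z + (x * y - beta * z) * dt)

a = (1.0, 1.0, 1.0)
b = (1.000001, 1.0, 1.0)   # a "rounded" restart: one part in a million different

max_sep = 0.0
for step in range(40000):   # 40 model time units at dt = 0.001
    a = lorenz_step(a)
    b = lorenz_step(b)
    max_sep = max(max_sep, abs(a[0] - b[0]))

print(f"Largest separation in x: {max_sep:.2f}")
```

The separation grows from one millionth to the full size of the attractor itself: the two "forecasts" end up bearing no resemblance to each other, exactly as Lorenz observed.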

This insight into the large impact of minor occurrences carried the butterfly effect into many other fields, including psychology, and explained why predictions at times remain inaccurate; recognizing the importance of initial conditions can therefore improve the accuracy of scientific predictions. Many biologists, epidemiologists, physicists, ecologists, and psychologists now consider the butterfly effect, nonlinear reasoning, and chaos when making predictions, which has proved useful in the social and behavioral sciences as well as the physical and biological sciences. The butterfly effect is used to explain various unpredictable behaviors and thinking patterns which, though they may appear meaningless, can be understood through nonlinear reasoning. This can lead to an understanding of creative, original insights by taking initial perceptions into account and permitting nonlinear cognitive processes, yielding valuable insights or solutions to a given problem.