Thursday, March 19, 2015

Numeracy


From Wikipedia, the free encyclopedia


Children in Laos have fun as they improve numeracy with "Number Bingo." They roll three dice, construct an equation from the numbers to produce a new number, then cover that number on the board, trying to get 4 in a row.
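The arithmetic behind a "Number Bingo" turn can be sketched in a few lines of Python. This is a hypothetical reconstruction: the caption does not say which operations the children may use, so addition, subtraction, multiplication, and division are assumed, with two operations combining the three dice.

```python
from itertools import permutations, product

def reachable_numbers(dice):
    """All positive whole numbers obtainable by combining three dice
    with +, -, *, / (two operations, either bracketing)."""
    ops = [
        lambda a, b: a + b,
        lambda a, b: a - b,
        lambda a, b: a * b,
        lambda a, b: a / b if b else None,  # skip division by zero
    ]
    results = set()
    for a, b, c in permutations(dice):
        for f, g in product(ops, repeat=2):
            # Bracketing 1: (a f b) g c
            left = f(a, b)
            if left is not None:
                r = g(left, c)
                if r is not None and r > 0 and r == int(r):
                    results.add(int(r))
            # Bracketing 2: a f (b g c)
            right = g(b, c)
            if right is not None:
                r = f(a, right)
                if r is not None and r > 0 and r == int(r):
                    results.add(int(r))
    return results

print(sorted(reachable_numbers((2, 3, 5))))
```

Rolling 2, 3, 5, for instance, lets a player cover any whole positive result such as 10 (2 + 3 + 5), 11 (2 × 3 + 5), or 30 (2 × 3 × 5).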

Numeracy is the ability to reason and to apply simple numerical concepts.[1] Basic numeracy skills consist of comprehending fundamental arithmetic operations such as addition, subtraction, multiplication, and division. For example, if one can understand a simple equation such as 2 + 2 = 4, then one possesses at least basic numeric knowledge. Substantial aspects of numeracy also include number sense, operation sense, computation, measurement, geometry, probability, and statistics. A numerically literate person can manage and respond to the mathematical demands of life.[2] By contrast, innumeracy (the lack of numeracy) can have a negative impact. Numeracy influences careers, literacy, and risk perception in health decisions. Low numeracy distorts risk perception in health decisions[3] and may negatively affect economic choices.[4][5] "Greater numeracy has been associated with reduced susceptibility to framing effects, less influence of nonnumerical information such as mood states, and greater sensitivity to different levels of numerical risk".[6]

Representation of numbers

Humans have evolved to mentally represent numbers in two major ways from observation (not formal math).[7] These representations are often thought to be innate[8] (see Numerical cognition), to be shared across human cultures,[9] to be common to multiple species,[10] and not to be the result of individual learning or cultural transmission. They are:
  1. Approximate representation of numerical magnitude, and
  2. Precise representation of the quantity of individual items.
Approximate representations of numerical magnitude mean that one can roughly estimate and compare quantities even when the numbers are large (see Approximate number system). For example, one experiment showed children and adults arrays of many dots.[7] After briefly observing them, both groups could accurately estimate the approximate number of dots. However, distinguishing differences between large numbers of dots proved to be more challenging.[7]

Precise representations of distinct individuals demonstrate that people are more accurate in estimating amounts and distinguishing differences when the numbers are relatively small (see Subitizing).[7] For example, in one experiment, an experimenter presented an infant with two piles of crackers, one with two crackers the other with three. The experimenter then covered each pile with a cup. When allowed to choose a cup, the infant always chose the cup with more crackers because the infant could distinguish the difference.[7]

Both systems—approximate representation of magnitude and precise representation of the quantity of individual items—have limited power. For example, neither allows representations of fractions or negative numbers. More complex representations require education. However, achievement in school mathematics correlates with an individual's unlearned approximate number sense.[11]

Definitions and assessment

Fundamental (or rudimentary) numeracy skills include understanding of the real number line, time, measurement, and estimation.[3] Fundamental skills include basic skills (the ability to identify and understand numbers) and computational skills (the ability to perform simple arithmetical operations and compare numerical magnitudes).

More sophisticated numeracy skills include understanding of ratio concepts (notably fractions, proportions, percentages, and probabilities), and knowing when and how to perform multistep operations.[3] Two categories of skills are included at the higher levels: analytical skills (the ability to understand numerical information, such as that required to interpret graphs and charts) and statistical skills (the ability to apply higher-level probabilistic and statistical computations, such as conditional probabilities).
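As a small illustration of the statistical-skills category, a conditional probability is computed from a contingency table by restricting attention to the cases where the condition holds. The survey counts below are invented purely for illustration:

```python
# Hypothetical survey of 200 adults, cross-tabulating smoking status
# against a health condition. All counts are invented for illustration.
counts = {
    ("smoker", "condition"): 24,
    ("smoker", "no condition"): 36,
    ("non-smoker", "condition"): 14,
    ("non-smoker", "no condition"): 126,
}

total = sum(counts.values())  # 200 respondents

# Unconditional probability of the condition: all cases / everyone.
p_condition = (counts[("smoker", "condition")]
               + counts[("non-smoker", "condition")]) / total

# Conditional probability P(condition | smoker): restrict to smokers only.
smokers = counts[("smoker", "condition")] + counts[("smoker", "no condition")]
p_condition_given_smoker = counts[("smoker", "condition")] / smokers

print(f"P(condition)          = {p_condition:.2f}")              # 38/200 = 0.19
print(f"P(condition | smoker) = {p_condition_given_smoker:.2f}")  # 24/60 = 0.40
```

The multistep aspect is exactly what the paragraph describes: the numerate reader must first pick the right denominator (smokers, not everyone) before dividing.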

A variety of tests have been developed for assessing numeracy and health numeracy.[3][6][12][13][14][15][16]

Childhood influences

The first couple of years of childhood are considered to be a vital part of life for the development of numeracy and literacy.[17] There are many components that play key roles in the development of numeracy at a young age, such as Socioeconomic Status (SES), parenting, Home Learning Environment (HLE), and age.[17]

Socioeconomic status

Children who are brought up in families with high SES tend to be more engaged in developmentally enhancing activities.[17] These children are more likely to develop the necessary abilities to learn and to become more motivated to learn.[17] More specifically, a mother's education level is considered to have an effect on the child's ability to achieve in numeracy. That is, mothers with a high level of education will tend to have children who succeed more in numeracy.[17]

Parenting

Parents are encouraged to collaborate with their child on simple learning exercises, such as reading a book, painting, drawing, and playing with numbers. Using complex language, being responsive to the child, and establishing warm interactions are also recommended, as these behaviors have been associated with positive numeracy outcomes.[17] Beneficial parenting behaviors form a feedback loop: pleased parents are more willing to interact with their child, which in turn promotes better development in the child.[17]

Home-learning environment

Along with parenting and SES, a strong home-learning environment increases the likelihood of the child being prepared for comprehending complex mathematical schooling.[18] For example, if a child is influenced by many learning activities in the household, such as puzzles, coloring books, mazes, or books with picture riddles, then they will be more prepared to face school activities.[18]

Age

Age is accounted for when discussing the development of numeracy in children.[18] Children under the age of 5 have the best opportunity to absorb basic numeracy skills.[18] After the age of 7, the acquisition of basic numeracy skills becomes less influential.[18] For example, a study compared the reading and mathematical abilities of children aged 5 and 7, each in three mental capacity groups (underachieving, average, and overachieving). The differences in the amount of knowledge retained were greater between the groups at age 5 than between the groups at age 7. This suggests that younger children have a greater capacity to retain information, including numeracy.

Literacy

There seems to be a relationship between literacy and numeracy,[19][20] which can be seen in young children. Depending on the level of literacy or numeracy at a young age, one can predict the growth of literacy and/or numeracy skills in future development.[21] There is some evidence that humans may have an inborn sense of number. In one study, for example, five-month-old infants were shown two dolls, which were then hidden with a screen. The babies saw the experimenter pull one doll from behind the screen. Without the child's knowledge, a second experimenter could remove or add dolls, unseen behind the screen. When the screen was removed, the infants showed more surprise at an unexpected number (for example, if there were still two dolls). Some researchers have concluded that the babies were able to count, although others doubt this and claim the infants noticed surface area rather than number.[22]

Employment

Numeracy has a huge impact on employment.[23] In a work environment, numeracy can be a controlling factor in career achievements and failures.[23] Many professions require individuals to have a well-developed sense of numeracy, for example: mathematician, physicist, accountant, actuary, risk analyst, financial analyst, engineer, and architect. Even outside these specialized areas, a lack of proper numeracy skills can reduce employment opportunities and promotions, resulting in unskilled manual careers, low-paying jobs, and even unemployment.[24]
For example, carpenters and interior designers need to be able to measure, use fractions, and handle budgets.[25] Another example of numeracy's influence on employment comes from the Poynter Institute, which has included numeracy among the skills required of competent journalists. Max Frankel, former executive editor of the New York Times, argues that "deploying numbers skillfully is as important to communication as deploying verbs". Nevertheless, journalists often show poor numeracy skills: in a study by the Society of Professional Journalists, 58% of job applicants interviewed by broadcast news directors lacked an adequate understanding of statistical materials.[26]

With regard to assessing applicants for an employment position, psychometric numerical reasoning tests have been created by occupational psychologists, who are involved in the study of numeracy. These tests are used to assess an applicant's ability to comprehend and apply numbers, and are sometimes administered with a time limit, requiring the test-taker to think quickly and accurately. Research has shown that these tests are useful in evaluating potential applicants because, unlike interview questions, they do not allow applicants to prepare, which suggests that an applicant's results are reliable and accurate.[27]

These psychometric numerical reasoning tests first became prevalent during the 1980s, following the pioneering work of psychologists such as P. Kline. In 1986, Kline published "A Handbook of Test Construction: Introduction to Psychometric Design", which argued that psychometric testing could provide reliable and objective results, and that these findings could be used to assess a candidate's abilities in numeracy. Psychometric numerical reasoning tests are likely to remain part of employment assessments as a way to differentiate and evaluate applicants fairly and accurately.

Innumeracy and dyscalculia

Innumeracy, a neologism coined as an analogue of illiteracy, refers to a lack of ability to reason with numbers. The term was coined by cognitive scientist Douglas Hofstadter and popularized in 1989 by mathematician John Allen Paulos in his book Innumeracy: Mathematical Illiteracy and Its Consequences.

Developmental dyscalculia refers to a persistent and specific impairment of basic numerical-arithmetical skills learning in the context of normal intelligence.

Patterns and differences

The root cause of innumeracy varies. Innumeracy has been observed in those suffering from poor education and childhood deprivation of numeracy.[28] Innumeracy is apparent in children during the transition from the numerical skills acquired before schooling to the new skills taught in school, owing to limits on their memory capacity to comprehend the material.[28] Patterns of innumeracy have also been observed depending on age, gender, and race.[29] Older adults have been associated with lower numeracy skills than younger adults.[29] Men have been identified as having higher numeracy skills than women.[23] Some studies seem to indicate young people of African heritage tend to have lower numeracy skills.[29] In the Trends in International Mathematics and Science Study (TIMSS), children at fourth grade (average age 10 to 11) and eighth grade (average age 14 to 15) from 49 countries were tested on mathematical comprehension. The assessment included tests of number, algebra (called patterns and relationships at fourth grade), measurement, geometry, and data. The latest study, in 2003, found that children from Singapore had the highest performance at both grade levels. Hong Kong SAR, Japan, and Taiwan also shared high levels of numeracy. The lowest scores were found in countries such as South Africa, Ghana, and Saudi Arabia. Another finding was a noticeable difference between boys and girls, with some exceptions: girls performed significantly better in Singapore, and boys performed significantly better in the United States.[7]

Theory

There is a theory that innumeracy is more common than illiteracy when dividing cognitive abilities into two separate categories. David C. Geary, a notable cognitive developmental and evolutionary psychologist from the University of Missouri, created the terms "biological primary abilities" and "biological secondary abilities".[28] Biological primary abilities evolve over time and are necessary for survival. Such abilities include speaking a common language or knowledge of simple mathematics.[28] Biological secondary abilities are attained through personal experiences and cultural customs, such as reading or high level mathematics learned through schooling.[28] Literacy and numeracy are similar in the sense that they are both important skills used in life. However, they differ in the sorts of mental demands each makes. Literacy consists of acquiring vocabulary and grammatical sophistication, which seem to be more closely related to memorization, whereas numeracy involves manipulating concepts, such as in calculus or geometry, and builds from basic numeracy skills.[28] This could be a potential explanation of the challenge of being numerate.[28]

Innumeracy and risk perception in health decision-making

Health numeracy has been defined as "the degree to which individuals have the capacity to access, process, interpret, communicate, and act on numerical, quantitative, graphical, biostatistical, and probabilistic health information needed to make effective health decisions".[30] The concept of health numeracy is a component of the concept of health literacy. Health numeracy and health literacy can be thought of as the combination of skills needed for understanding risk and making good choices in health-related behavior.

Health numeracy requires basic numeracy but also more advanced analytical and statistical skills. For instance, health numeracy also requires the ability to understand probabilities or relative frequencies in various numerical and graphical formats, and to engage in Bayesian inference, while avoiding errors sometimes associated with Bayesian reasoning (see Base rate fallacy, Conservatism (Bayesian)). Health numeracy also requires understanding terms with definitions that are specific to the medical context. For instance, although 'survival' and 'mortality' are complementary in common usage, these terms are not complementary in medicine (see five-year survival rate).[31][32] Innumeracy is also a very common problem when dealing with risk perception in health-related behavior; it is associated with patients, physicians, journalists and policymakers.[29][32] Those who lack or have limited health numeracy skills run the risk of making poor health-related decisions because of an inaccurate perception of information.[17] For example, if a patient has been diagnosed with breast cancer, being innumerate may hinder her ability to comprehend her physician's recommendations or even the severity of the health concern. One study found that people tended to overestimate their chances of survival or even to choose lower quality hospitals.[23]
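The Bayesian-inference skill mentioned above can be made concrete with a standard screening-test calculation. The prevalence, sensitivity, and specificity figures below are illustrative, not clinical data; the point is that a positive result from a fairly accurate test can still imply a low probability of disease when the base rate is small (the base rate fallacy):

```python
def positive_predictive_value(prevalence, sensitivity, specificity):
    """P(disease | positive test), by Bayes' theorem."""
    true_positives = prevalence * sensitivity          # diseased and detected
    false_positives = (1 - prevalence) * (1 - specificity)  # healthy but flagged
    return true_positives / (true_positives + false_positives)

# Illustrative numbers: 1% prevalence, 90% sensitivity, 91% specificity.
ppv = positive_predictive_value(0.01, 0.90, 0.91)
print(f"P(disease | positive) = {ppv:.2f}")  # 0.09, far below what most people guess
```

Restated in natural frequencies, one of the formats studied for communicating such risks: of 1,000 people, about 9 of the 10 with the disease test positive, but so do roughly 89 of the 990 healthy people, so only about 9 of the 98 positive results are true positives.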
Innumeracy also makes it difficult or impossible for some patients to read medical graphs correctly.[33] Some authors have distinguished graph literacy from numeracy.[34] Indeed, many doctors exhibit innumeracy when attempting to explain a graph or statistics to a patient. Once again, a misunderstanding between a doctor and patient due to either the doctor, patient, or both being unable to comprehend numbers effectively could result in serious health consequences.

Different presentation formats of numerical information, for instance natural frequency icon arrays, have been evaluated to assist both low numeracy and high numeracy individuals.[29][35][36][37][38]

Silicon Valley


From Wikipedia, the free encyclopedia
 

Silicon Valley, as seen from over north San Jose, facing southbound towards Downtown San Jose

Downtown San Jose as seen with lit palm trees

Silicon Valley is a nickname for the southern portion of the San Francisco Bay Area of Northern California in the United States. It is home to many of the world's largest high tech corporations, as well as thousands of tech startup companies. The region occupies roughly the same area as the Santa Clara Valley where it is centered, including San Jose and surrounding cities and towns. The term originally referred to the region's large number of silicon chip innovators and manufacturers, but eventually came to refer to all high-tech businesses in the area, and is now generally used as a metonym for the American high-technology economic sector.

Silicon Valley is a leading hub for high-tech innovation and development, accounting for one-third of all of the venture capital investment in the United States. Geographically, Silicon Valley is generally thought to encompass all of the Santa Clara Valley, the southern San Francisco Peninsula, and southern portions of the East Bay.

Origin of the term

The term Silicon Valley is attributed to Ralph Vaerst, a local entrepreneur. Its first published use is credited to Don Hoefler, a friend of Vaerst's, who used the phrase as the title of a series of articles in the weekly trade newspaper Electronic News. The series, entitled "Silicon Valley in the USA", began in the paper's January 11, 1971 issue. The term gained widespread use in the early 1980s, at the time of the introduction of the IBM PC and numerous related hardware and software products to the consumer market. The Silicon part of the name refers to the high concentration of companies involved in the making of semiconductors (silicon is used to create most semiconductors commercially) and computer industries that were concentrated in the area. These firms slowly replaced the orchards and related agriculture and food production companies which gave the area its initial nickname — the "Valley of Heart's Delight."

History

"Perhaps the strongest thread that runs through the Valley's past and present is the drive to "play" with novel technology, which, when bolstered by an advanced engineering degree and channeled by astute management, has done much to create the industrial powerhouse we see in the Valley today."[1]

Background


Looking west over northern San Jose (downtown is at far left) and other parts of Silicon Valley
Stanford University, its affiliates, and graduates have played a major role in the development of this area.[2] Some examples include the work of Lee De Forest with his invention of a pioneering vacuum tube called the Audion and the oscilloscopes of Hewlett-Packard.

A very powerful sense of regional solidarity accompanied the rise of Silicon Valley. From the 1890s, Stanford University's leaders saw its mission as service to the West and shaped the school accordingly. At the same time, the perceived exploitation of the West at the hands of eastern interests fueled booster-like attempts to build self-sufficient indigenous local industry. Thus, regionalism helped align Stanford's interests with those of the area's high-tech firms for the first fifty years of Silicon Valley's development.[3]

During the 1940s and 1950s, Frederick Terman, as Stanford's dean of engineering and provost, encouraged faculty and graduates to start their own companies. He is credited with nurturing Hewlett-Packard, Varian Associates, and other high-tech firms, until what would become Silicon Valley grew up around the Stanford campus. Terman is often called "the father of Silicon Valley".[4]

In 1956, William Shockley, co-inventor of the transistor, moved from New Jersey to Mountain View, California, to start Shockley Semiconductor Laboratory and to live closer to his ailing mother in Palo Alto. Shockley's work served as the basis for many electronic developments for decades.[5][6]

During 1955–85, solid state technology research and development at Stanford University followed three waves of industrial innovation made possible by support from private corporations, mainly Bell Telephone Laboratories, Shockley Semiconductor, Fairchild Semiconductor, and Xerox PARC. In 1969, the Stanford Research Institute (now SRI International), operated one of the four original nodes that comprised ARPANET, predecessor to the Internet.[7]

Social roots of the information technology revolution

It was in Silicon Valley that the silicon-based integrated circuit, the microprocessor, and the microcomputer, among other key technologies, were developed. As of 2013 the region employed about a quarter of a million information technology workers.[8] Silicon Valley was formed as a milieu of innovations by the convergence on one site of new technological knowledge; a large pool of skilled engineers and scientists from major universities in the area; generous funding from an assured market with the Defense Department; the development of an efficient network of venture capital firms; and, in the very early stage, the institutional leadership of Stanford University.[9]

Roots in radio and military technology

The first ship-to-shore wireless telegraph message to be received in the US was from the San Francisco lightship outside the Golden Gate, signaling the return of the American fleet from the Philippines after their victory in the Spanish–American War. The ship had been outfitted with a wireless telegraph transmitter by a local newspaper, so that they could prepare a celebration on the return of the American sailors.[10]

The Bay Area had long been a major site of United States Navy research and technology. In 1909, Charles Herrold started the first radio station in the United States with regularly scheduled programming in San Jose. Later that year, Stanford University graduate Cyril Elwell purchased the U.S. patents for Poulsen arc radio transmission technology and founded the Federal Telegraph Corporation (FTC) in Palo Alto. Over the next decade, the FTC created the world's first global radio communication system, and signed a contract with the Navy in 1912.[1]

In 1933, Air Base Sunnyvale, California, was commissioned by the United States Government for use as a Naval Air Station (NAS) to house the airship USS Macon in Hangar One. The station was renamed NAS Moffett Field, and between 1933 and 1947, U.S. Navy blimps were based there.[11] A number of technology firms had set up shop in the area around Moffett Field to serve the Navy. When the Navy gave up its airship ambitions and moved most of its west coast operations to San Diego, the National Advisory Committee for Aeronautics (NACA, forerunner of NASA) took over portions of Moffett Field for aeronautics research. Many of the original companies stayed, while new ones moved in. The immediate area was soon filled with aerospace firms, such as Lockheed.

Ham radio

The Bay Area was an early center of ham radio, home to about 10% of the operators in the United States. William Eitel, Jack McCullough, and Charles Litton, who together pioneered vacuum tube manufacturing in the Bay Area, were hobbyists with locally gained technical training who participated in the ham radio hobby's development of shortwave radio. High-frequency, and especially very-high-frequency (VHF), transmission in the 10-meter band required higher-quality power tubes than those manufactured by the consortium of RCA, Western Electric, General Electric, and Westinghouse, which controlled vacuum tube manufacture. Litton, founder of Litton Industries, pioneered manufacturing techniques that led to wartime contracts for radar transmitting tubes being awarded to Eitel-McCullough, a San Bruno firm that manufactured power-grid tubes for radio amateurs and aircraft radio equipment.[12]

Sputnik

On October 4, 1957, the Soviet Union launched the first space satellite, Sputnik, which sparked fear that the Soviet Union was pulling ahead technologically. After President Eisenhower signed the National Aeronautics and Space Act, which created NASA, the government turned to Fairchild Semiconductor, then among the few companies able to manufacture the needed transistors, and funded its work. The company was highly successful, and the contracts put it on the map.[13]

Welfare capitalism

A union organizing drive in 1939–40 at Eitel-McCullough by the strong Bay Area labor movement was fought off through a strategy of welfare capitalism that included pensions and other generous benefits, profit sharing, and such extras as a medical clinic and a cafeteria. An atmosphere of cooperation and collaboration was established.[14] Successes have been few and far between[15] for union organizing drives by the UE and others in subsequent years.[16]

Stanford Industrial Park

After World War II, universities were experiencing enormous demand due to returning students. To address the financial demands of Stanford's growth requirements, and to provide local employment opportunities for graduating students, Frederick Terman proposed the leasing of Stanford's lands for use as an office park, named the Stanford Industrial Park (later Stanford Research Park). Leases were limited to high technology companies. Its first tenant was Varian Associates, founded by Stanford alumni in the 1930s to build military radar components. However, Terman also found venture capital for civilian technology start-ups. One of the major success stories was Hewlett-Packard. Founded in Packard's garage by Stanford graduates William Hewlett and David Packard, Hewlett-Packard moved its offices into the Stanford Research Park shortly after 1953. In 1954, Stanford created the Honors Cooperative Program to allow full-time employees of the companies to pursue graduate degrees from the University on a part-time basis. The initial companies signed five-year agreements in which they would pay double the tuition for each student in order to cover the costs. Hewlett-Packard has become the largest personal computer manufacturer in the world, and transformed the home printing market when it released the first thermal drop-on-demand ink jet printer in 1984.[17] Other early tenants included Eastman Kodak, General Electric, and Lockheed.[18]

The silicon transistor

In 1953, William Shockley left Bell Labs in a disagreement over the handling of the invention of the transistor. After returning to California Institute of Technology for a short while, Shockley moved to Mountain View, California, in 1956, and founded Shockley Semiconductor Laboratory. Unlike many other researchers who used germanium as the semiconductor material, Shockley believed that silicon was the better material for making transistors. Shockley intended to replace the current transistor with a new three-element design (today known as the Shockley diode), but the design was considerably more difficult to build than the "simple" transistor. In 1957, Shockley decided to end research on the silicon transistor. As a result of Shockley's abusive management style, eight engineers left the company to form Fairchild Semiconductor; Shockley referred to them as the "traitorous eight".
Two of the original employees of Fairchild Semiconductor, Robert Noyce and Gordon Moore, would go on to found Intel.[19][20]

Chips

In April 1974, Intel released the Intel 8080,[21] a "computer on a chip" and "the first truly usable microprocessor." A microprocessor incorporates the functions of a computer's central processing unit (CPU) on a single integrated circuit (IC).[22]

Law firms

The rise of Silicon Valley was also bolstered by the development of appropriate legal infrastructure to support the rapid formation, funding, and expansion of high-tech companies, as well as the development of a critical mass of litigators and judges experienced in resolving disputes between such firms. From the early 1980s onward, many national (and later international) law firms opened offices in San Francisco and Palo Alto in order to provide Silicon Valley startups with legal services. Furthermore, California law has a number of quirks which help entrepreneurs establish startups at the expense of established firms, such as a nearly absolute ban on non-compete clauses in employment agreements.[citation needed]

Homebrew Computer Club


Invitation to first Homebrew Computer Club meeting (sent to Steve Dompier).

The Homebrew Computer Club was an informal group of electronic enthusiasts and technically minded hobbyists who gathered to trade parts, circuits, and information pertaining to DIY construction of computing devices.[23] It was started by Gordon French and Fred Moore who met at the Community Computer Center in Menlo Park. They both were interested in maintaining a regular, open forum for people to get together to work on making computers more accessible to everyone.[24]

The first meeting was held in March 1975 in French's garage in Menlo Park, San Mateo County, California, on the occasion of the arrival in the area of the first MITS Altair microcomputer, a unit sent for review by People's Computer Company. Steve Wozniak credits that first meeting with inspiring him to design the Apple I.[25] Subsequent meetings were held at an auditorium at the Stanford Linear Accelerator Center.[26]

Venture capital firms

By the early 1970s, there were many semiconductor companies in the area, computer firms using their devices, and programming and service companies serving both. Industrial space was plentiful and housing was still inexpensive. The growth was fueled by the emergence of the venture capital industry on Sand Hill Road, beginning with Kleiner Perkins in 1972; the availability of venture capital exploded after the successful $1.3 billion IPO of Apple Computer in December 1980.

Media

In 1980 Intelligent Machines Journal, a hobbyist journal, changed its name to InfoWorld, and, with offices in Palo Alto, began covering the explosive emergence of the microcomputer industry in the valley.[27]

Software

Although semiconductors are still a major component of the area's economy, Silicon Valley has been most famous in recent years for innovations in software and Internet services. Silicon Valley has significantly influenced computer operating systems, software, and user interfaces.

Using money from NASA, the US Air Force, and ARPA, Doug Engelbart invented the mouse and hypertext-based collaboration tools in the mid-1960s and 1970s while at Stanford Research Institute (now SRI International), first publicly demonstrated in 1968 in what is now known as The Mother of All Demos. Engelbart's Augmentation Research Center at SRI was also involved in launching the ARPANET (precursor to the Internet) and starting the Network Information Center (now InterNIC). Xerox hired some of Engelbart's best researchers beginning in the early 1970s. In turn, in the 1970s and 1980s, Xerox's Palo Alto Research Center (PARC) played a pivotal role in object-oriented programming, graphical user interfaces (GUIs), Ethernet, PostScript, and laser printers.

While Xerox marketed equipment using its technologies, for the most part its technologies flourished elsewhere. The diaspora of Xerox inventions led directly to 3Com and Adobe Systems, and indirectly to Cisco, Apple Computer, and Microsoft. Apple's Macintosh GUI was largely a result of Steve Jobs' visit to PARC and the subsequent hiring of key personnel.[28] Cisco's impetus stemmed from the need to route a variety of protocols over Stanford's campus Ethernet.

Internet bubble

Silicon Valley is generally considered to have been the center of the dot-com bubble, which started in the mid-1990s and collapsed after the NASDAQ stock market began to decline dramatically in April 2000. During the bubble era, real estate prices reached unprecedented levels. For a brief time, Sand Hill Road was home to the most expensive commercial real estate in the world, and the booming economy resulted in severe traffic congestion.

After the dot-com crash, Silicon Valley has continued to maintain its status as one of the top research and development centers in the world. A 2006 Wall Street Journal story found that 12 of the 20 most inventive towns in America were in California, and 10 of those were in Silicon Valley.[29] San Jose led the list with 3,867 utility patents filed in 2005; Sunnyvale was second, with 1,881.[30]

Economy

According to a 2008 study by AeA using 2006 data, Silicon Valley was the third largest high-tech center (cybercity) in the United States, behind the New York and Washington metropolitan areas, with 225,300 high-tech jobs. The Bay Area as a whole, of which Silicon Valley is a part, would rank first, with 387,000 high-tech jobs. Silicon Valley has the highest concentration of high-tech workers of any metropolitan area (285.9 out of every 1,000 private-sector workers) and the highest average high-tech salary, at $144,800.[31] Largely as a result of the high-technology sector, the San Jose-Sunnyvale-Santa Clara, CA Metropolitan Statistical Area has the most millionaires and the most billionaires in the United States per capita.[32]

The region is the biggest high-tech manufacturing center in the United States.[33][34] The unemployment rate of the region was 9.4% in January 2009, up from 7.8% in the previous month.[35] Silicon Valley received 41% of all U.S. venture investment in 2011, and 46% in 2012.[36]

Notable companies

Thousands of high technology companies are headquartered in Silicon Valley, including a number of Fortune 1000 firms, along with other notable companies (some now defunct or subsumed) that are headquartered in, or have a significant presence in, the valley.
Silicon Valley is also home to the high-tech superstore retail chain Fry's Electronics.

Notable government facilities

Demographics

Depending on what geographic regions are included in the meaning of the term, the population of Silicon Valley is between 3.5 and 4 million. A 1999 study by AnnaLee Saxenian for the Public Policy Institute of California reported that a third of Silicon Valley scientists and engineers were immigrants, and that nearly a quarter of Silicon Valley's high-technology firms founded since 1980 were run by Chinese (17 percent) or Indian (7 percent) CEOs.[37]

Diversity

A 2014 article in The Washington Post presented a case for the imbalance in the ratio of women to men in leadership and technical positions at Silicon Valley firms.[38] The article states that although roughly equal numbers of men and women work in the tech industry, few women reach leadership positions. The same imbalance shows up in the number of new companies founded by women and in the number of women-led startups that receive venture capital funding. The article's author, Vivek Wadhwa, stated that he believed a contributing cause was upbringing and parental influence: "...very few parents encourage their daughters to study science and engineering."[39] Wadhwa also cited the lack of female role models, noting that many famous tech leaders, such as Apple's Steve Jobs, Microsoft's Bill Gates, and Facebook's Mark Zuckerberg, are male.[38]

As of 2014, in response to gender imbalance in their technical workforce, Silicon Valley firms such as Apple were working actively to prepare and recruit women. Bloomberg reported in October that Facebook, Google, and Microsoft attended the 20th annual Grace Hopper 'Women in Computing' conference to actively recruit and potentially hire female engineers and technology experts.[40]

Starting in May 2014, tech companies such as Google, Facebook, and Apple began releasing corporate transparency reports with detailed employee breakdowns.[40] Facebook reported that 77% of its senior-level employees were men and 74% were white; overall, 41% of its employees were of Asian origin, while 2% were black.[41] At Google, 17% of employees were women, and 72% of leadership positions were held by whites (Hispanic and non-Hispanic).[42] These findings drew criticism from some over the low employment of female and black employees, relative to the over-representation of Asian-American employees, who make up about 5% of the national population.[43][44]

Municipalities

The following Santa Clara County cities are located in the Santa Clara Valley and on that basis are traditionally considered to be in Silicon Valley (in alphabetical order):
In 2015, MIT researchers developed a novel method for measuring which towns are home to startups with higher growth potential; their method places the center of Silicon Valley in the municipalities of Menlo Park, Mountain View, Palo Alto, and Sunnyvale.[45][46]

The following Bay Area cities are (or were) home to various high-tech companies (or related firms like venture capital firms) and have thereby become associated with Silicon Valley:

Universities, colleges, and trade schools

Media outlets

Local and national media cover Silicon Valley and its companies. CNN, The Wall Street Journal, and Bloomberg News operate Silicon Valley bureaus out of Palo Alto. Public broadcaster KQED (TV) and KQED-FM, as well as the Bay Area's local ABC station KGO-TV, operate bureaus in San Jose. KNTV, NBC's local Bay Area affiliate "NBC Bay Area", is located in San Jose; the nationally distributed TV show "Tech Now" and the CNBC Silicon Valley bureau are produced from this location. San Jose-based media serving Silicon Valley include the San Jose Mercury News daily and the Metro Silicon Valley weekly. Specialty media include El Observador and the San Jose / Silicon Valley Business Journal. Most of the Bay Area's other major TV stations, newspapers, and media operate in San Francisco or Oakland. Patch.com operates paloalto.patch.com, mountainview.patch.com, and others, providing local news, discussion, and events for residents of Silicon Valley.

Fiber-optic breakthrough could lead to real AI

  • March 18, 2015 at 12:23 pm
  • Original link:  http://www.extremetech.com/extreme/201404-fiber-optic-breakthrough-could-lead-to-real-ai

What do Elon Musk, Bill Gates, and Stephen Hawking all have in common? Judging by their outspoken remarks on the looming dangers of runaway AI, we might guess that they all suffer from insomnia. These guys seem to have realized something that would not be so apparent to ordinary folks like you or me, something that keeps them up at night: when this AI eventually gets around to properly redistributing human resources, namely our wealth and brainpower, who do you think will be first on the chopping block? Although no one today has any idea how to unleash this predicted AI, researchers from Nanyang Technological University in Singapore have developed a new technology they think is already capable of the 'brain-like' computing effects many have imagined for it.

The secret to their new device is something they call 'photonic synapses'. They are exactly what they sound like: networks of light-bearing amorphous metal-sulphide microfibers that make synapses onto each other, much like those found in actual brains. These unique microfibers aren't something the researchers just pulled out of thin air. The so-called chalcogenide alloys (alloys containing elements like sulphur, selenium, or tellurium) are prized in the fiber business for the unique optical properties they give to glasses made with them. The researchers here used gallium lanthanum oxysulphide (GLSO) fibers, which can be photodarkened by light itself.
Photonic Synapse
The effect can be made temporary or permanent, depending on the particular wavelength, duration, and intensity of the light. Together with some way to physically combine these fibers into networks, that's basically all you need to make a primitive neuromorphic computing machine. The researchers used light at a wavelength of 532 nm to make the presynaptic portion of a junction, and light at 650 nm for the actual axonal spike leg of the network. By adjusting the above-mentioned parameters, the researchers could effectively depolarize or hyperpolarize (or at least produce the optical equivalent of an electrical polarization in) the GLSO axon at any point along its length.
Axosynapse
This capability is in some sense even more flexible than that of a real axon, which generally does not receive many synapses, if any at all, along its length. Taken alone, and for reasons we will detail below, the picture at the top that the authors created to illustrate their mechanism does not inspire a lot of confidence that they are making brain-like computers just yet. That's okay, because nobody ever said they have to make it just like a brain to do something useful. Their device may not have the ability to physically grow or shrink on demand in response to nearly every spike, as real brains can. But on the other hand, optical fiber can not only carry millions more signals per second than real neurons, it can carry them at the speed of light, and maybe, a big maybe, do so with less energy per spike.
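The temporary-versus-permanent photodarkening described above can be caricatured in a few lines of code. The sketch below is purely illustrative: the class name, the linear dose-to-darkening rule, and every constant are assumptions of ours, not taken from the Nanyang work.

```python
import math

class PhotonicSynapse:
    """Toy model of a GLSO fiber junction whose 'weight' is its transmittance.

    Exposure to control light darkens the fiber (lowers transmittance);
    depending on the exposure parameters, the change either decays away
    (temporary) or persists (permanent).
    """

    def __init__(self, transmittance=1.0):
        self.transmittance = transmittance  # 1.0 = fully transparent baseline
        self.volatile = 0.0                 # temporary darkening component

    def expose(self, duration_s, intensity, permanent=False):
        # Darkening grows with the light dose (duration * intensity),
        # capped so transmittance can never go negative.
        dose = duration_s * intensity
        darkening = min(0.1 * dose, self.transmittance)
        if permanent:
            self.transmittance -= darkening
        else:
            self.volatile = min(self.volatile + darkening, self.transmittance)

    def relax(self, dt_s, tau_s=5.0):
        # Temporary darkening decays exponentially back toward baseline.
        self.volatile *= math.exp(-dt_s / tau_s)

    def transmit(self, signal):
        # Effective weight applied to an optical 'spike' passing through.
        return signal * max(self.transmittance - self.volatile, 0.0)
```

A network of such objects, wired so that one fiber's transmitted output drives the exposure of another, would give a crude software analog of the plasticity the authors demonstrate in glass.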

What makes the picture slightly amateurish to the neuroscientist is that they drew it with the presynapses on the axon instead of the dendrite. When that actually happens in real neurons, it is quite a big deal, so big that folks will write whole stories about the so-called chandelier cells that manage to do it. A more confidence-instilling depiction might introduce some directionality to the signal flow at the level of the dendrites. But as we said, for a proof of principle that is fine. The tougher challenge will be getting any 'brain-like' computing capability, or even rudimentary AI functions like vehicle control, speech, or search, from the thing.


We mentioned Musk at the beginning because he has a lot at stake on the issue and, therefore it seems, a lot to say. His original stance that 'AI is more dangerous than nuclear weapons' has perhaps wisely softened to 'someday human-driven cars may be outlawed.' More recently, he has assured us that 'we don't have to worry about autonomous cars,' something that is really not all 'that difficult to build.' The message we are taking home at this point in the game is that we don't have to worry much about photonic-based AI yet either.

We are big fans of new automotive technology here at ET, particularly things like smart cars or flight-worthy conceptions like Aeromobile. However, if the first flying car needs to also be the first fully autonomously controlled vehicle, it will need nothing short of brain-like photonic processors to make anybody believe it.

This new work is a much-needed hardware breakthrough to begin to discover just what can be done with simple neuron elements that are physically realized, as opposed to just being simulated on a von Neumann-style programmable machine. We had hoped that memristor or even neuristor technology would have evolved to the point of a practical robotic implementation by now, but progress has been slow. An optical technology, while potentially faster and more economical than semiconductor incarnations, may not lead immediately to new discoveries in the basic principles of neural computing architectures. It could, however, bring new players into the field, and breathe new life into these kinds of efforts.
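For comparison, the kind of simple neuron element that today must be simulated on a von Neumann-style machine takes only a few lines. The leaky integrate-and-fire model below is a standard textbook abstraction, with illustrative constants of our own choosing rather than parameters of any real neuromorphic device.

```python
def lif_run(inputs, tau=10.0, threshold=1.0, dt=1.0):
    """Leaky integrate-and-fire neuron: return the time steps at which it spikes."""
    v, spikes = 0.0, []
    for t, drive in enumerate(inputs):
        v += dt * (-v / tau + drive)  # membrane leaks toward rest while integrating input
        if v >= threshold:            # threshold crossing: emit a spike and reset
            spikes.append(t)
            v = 0.0
    return spikes
```

Feeding it a constant drive produces a regular spike train; a physically realized photonic element would replace this inner loop with the optics themselves, which is exactly the shortcut such hardware promises.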

Memory and trauma

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Memory_and_trauma ...