
Sunday, December 26, 2021

Sources of international law

From Wikipedia, the free encyclopedia

International law, also known as the "law of nations", is the body of rules that regulates the conduct of sovereign states in their relations with one another. Sources of international law include treaties, international custom, generally recognized principles of law, the decisions of national and lower courts, and scholarly writings. They are the materials and processes out of which the rules and principles regulating the international community are developed. They have been influenced by a range of political and legal theories.

Modern views

Article 38(1) of the Statute of the International Court of Justice is generally recognized as a definitive statement of the sources of international law. It requires the Court to apply, among other things, (a) international conventions, whether general or particular, establishing rules expressly recognized by the contesting states; (b) international custom, as evidence of a general practice accepted as law; (c) the general principles of law recognized by civilized nations; (d) subject to the provisions of Article 59, judicial decisions and the teachings of the most highly qualified publicists of the various nations, as subsidiary means for the determination of rules of law.

During the 19th century, it was recognized by legal positivists that a sovereign could limit its authority to act by consenting to an agreement according to the principle pacta sunt servanda. This consensual view of international law was reflected in the 1920 Statute of the Permanent Court of International Justice, and was later preserved in Article 38(1) of the 1946 Statute of the International Court of Justice.

The core content of general principles of law is general and dynamic, and they can sometimes be reduced to a maxim or a basic concept. Unlike other sources of law, such as enacted law or treaties, general principles of law have not been "posited" according to the formal sources of law. Nevertheless, general principles of law are regarded as part of positive law, even if they are used only as auxiliary tools. They embody essential principles for the actual operation of the legal system and are, in general, drawn from the legal reasoning of those empowered to take binding decisions in applying the law, namely the judiciary. They also serve as integrative devices of the system, filling actual or potential legal gaps. General principles of law have been the subject of extensive doctrinal debate in international law, owing to the various meanings attributed to the concept and the theoretical questions that they raise. Much confusion is caused by the use of the expression "fundamental principles of international law", which refers to principles standing at the top of the legal order that originate in treaty or custom (for example, the principle of the sovereign equality of states or the prohibition of the threat or use of force), and which are not dealt with here. Given the language of Article 38, paragraph 1(c) of the Statute of the International Court of Justice ("the general principles of law recognized by civilized nations"), the origin of the general principles of law applied at the international level has also been a source of debate. The conventional view holds that these principles have their origins in domestic legal systems. Once it is established that certain general principles are rules commonly found in domestic systems, they can be applied in international law as well. Some are logical inferences that can be found in any legal order: the principle of reparation for damage caused, principles of interpretation, or those used to resolve conflicts of rules, many of them known through Latin maxims, are clear examples. Other general principles of law, such as audiatur et altera pars, actori incumbit onus probandi, or the principle that the judge of the merits is also the judge of incidental jurisdiction, have been developed by the judiciary.

Hierarchy

On the question of preference between sources of international law, rules established by treaty will take preference if such an instrument exists. It is also argued, however, that international treaties and international custom are sources of international law of equal validity; this means that new custom may supersede older treaties and new treaties may override older custom. Also, jus cogens (peremptory norms) are customary in character, not treaty-based. Judicial decisions and juristic writings are certainly regarded as auxiliary sources of international law, whereas it is unclear whether the general principles of law recognized by 'civilized nations' should be recognized as a principal or auxiliary source of international law. Nevertheless, treaty, custom, and general principles of law are generally recognized as primary sources of international law.

Treaties as law

Treaties and conventions are the persuasive source of international law and are considered "hard law." Treaties can play the role of contracts between two or more parties, such as an extradition treaty or a defense pact. Treaties can also be legislation to regulate a particular aspect of international relations or form the constitutions of international organizations. Whether or not all treaties can be regarded as sources of law, they are sources of obligation for the parties to them. Article 38(1)(a) of the ICJ Statute, which uses the term "international conventions", concentrates upon treaties as a source of contractual obligation but also acknowledges the possibility of a state expressly accepting the obligations of a treaty to which it is not formally a party.

For a treaty-based rule to be a source of law, rather than simply a source of obligation, it must either be capable of affecting non-parties or have consequences for parties more extensive than those specifically imposed by the treaty itself.

Thus, the procedures or methods by which treaties become legally binding are formal sources of law: they constitute the process by which a legal rule comes into existence; they are law-creating.

Treaties as custom

Some treaties are the result of codifying existing customary law, such as laws governing the global commons, and jus ad bellum. While the purpose is to establish a code of general application, its effectiveness depends upon the number of states that ratify or accede to the particular convention. Relatively few such instruments have a sufficient number of parties to be regarded as international law in their own right. The most obvious example is the 1949 Geneva Conventions for the Protection of War Victims.

Most multi-lateral treaties fall short of achieving such a near-universal degree of formal acceptance and are dependent upon their provisions being regarded as representing customary international law and, by this indirect route, as binding upon non-parties. This outcome is possible in a number of ways:

  • When the treaty rule reproduces an existing rule of customary law, the rule will be clarified in terms of the treaty provision. A notable example is the Vienna Convention on the Law of Treaties 1969, which was considered by the ICJ to be law even before it had been brought into force.
  • When a customary rule is in the process of development, its incorporation in a multilateral treaty may have the effect of consolidating or crystallizing the law in the form of that rule. It is not always easy to identify when this occurs. Where the practice is less developed, the treaty provision may not be enough to crystallize the rule as part of customary international law.
  • Even if the rule is new, the drafting of the treaty provision may be the impetus for its adoption in the practice of states, and it is the subsequent acceptance of the rule by states that renders it effective as part of customary law. If a broad definition is adopted of state practice, the making of a treaty would fall within the definition. Alternatively, it is possible to regard the treaty as the final act of state practice required to establish the rule in question, or as the necessary articulation of the rule to give it the opinio juris of customary international law.
  • Convention-based "instant custom" has been identified by the ICJ on several occasions as representing customary law without explanation of whether the provision in question was supported by state practice. This has happened with respect to a number of provisions of the Vienna Convention on the Law of Treaties 1969. If "instant custom" is valid as law, it could deny to third parties the normal consequences of non-accession.

The United Nations Charter

Pursuant to Chapter XVI, Article 103 of the United Nations Charter, obligations under the Charter override the terms of any other treaty. Meanwhile, the Charter's Preamble affirms respect for the obligations arising from treaties and other sources of international law.

International custom

Article 38(1)(b) of the ICJ Statute refers to "international custom" as a source of international law, specifically emphasizing the two requirements of state practice plus acceptance of the practice as obligatory or opinio juris sive necessitatis (usually abbreviated as opinio juris).

Derived from the consistent practice of (originally) Western states accompanied by opinio juris (the conviction of States that the consistent practice is required by a legal obligation), customary international law is differentiated from acts of comity (mutual recognition of government acts) by the presence of opinio juris (although in some instances, acts of comity have developed into customary international law, i.e. diplomatic immunity). Treaties have gradually displaced much customary international law. This development is similar to the replacement of customary or common law by codified law in municipal legal settings, but customary international law continues to play a significant role in international law.

State practice

When examining state practice to determine relevant rules of international law, it is necessary to take into account every activity of the organs and officials of states that relates to that purpose. There has been continuing debate over where a distinction should be drawn as to the weight that should be attributed to what states do, rather than what they say represents the law. In its most extreme form, this would involve rejecting what states say as practice and relegating it to the status of evidence of opinio juris. A more moderate version would evaluate what a state says by reference to the occasion on which the statement was made. It is only relatively powerful countries with extensive international contacts and interests that have regular opportunities of contributing by deed to the practice of international law. The principal means of contribution to state practice for the majority of states will be at meetings of international organizations, particularly the UN General Assembly, by voting and otherwise expressing their view on matters under consideration. Moreover, there are circumstances in which what states say may be the only evidence of their view as to what conduct is required in a particular situation.

The notion of practice establishing a customary rule implies that the practice is followed regularly, or that such state practice must be "common, consistent and concordant". Given the size of the international community, the practice does not have to encompass all states or be completely uniform. There has to be a sufficient degree of participation, especially on the part of states whose interests are likely to be most affected, and an absence of substantial dissent. There have been a number of occasions on which the ICJ has rejected claims that a customary rule existed because of a lack of consistency in the practice brought to its attention.

Within the context of a specific dispute, however, it is not necessary to establish the generality of practice. A rule may apply if a state has accepted the rule as applicable to it individually, or because the two states belong to a group of states between which the rule applies.

A dissenting state is entitled to deny the opposability of a rule in question if it can demonstrate its persistent objection to that rule, either as a member of a regional group or by virtue of its membership of the international community. It is not easy for a single state to maintain its dissent. Also, rules of the jus cogens have a universal character and apply to all states, irrespective of their wishes.

Demand for rules that are responsive to increasingly rapid changes has led to the suggestion that there can be, in appropriate circumstances, such a concept as "instant custom". Even within traditional doctrine, the ICJ has recognized that passage of a short period of time is not necessarily a bar to the formation of a new rule. Because of this, the question is sometimes raised as to whether the word "custom" is suitable to a process that could occur with great rapidity.

Practice by international organizations

It may be argued that the practice of international organizations, most notably that of the United Nations, as it appears in the resolutions of the Security Council and the General Assembly, is an additional source of international law, even though it is not mentioned as such in Article 38(1) of the 1946 Statute of the International Court of Justice. Article 38(1) is closely based on the corresponding provision of the 1920 Statute of the Permanent Court of International Justice, thus predating the role that international organizations have come to play on the international plane. That is, the provision of Article 38(1) may be regarded as 'dated', and this can most vividly be seen in the mention made of 'civilized nations', a mention that appears all the more quaint after the decolonization process that took place in the early 1960s and the participation of nearly all nations of the world in the United Nations.

Opinio juris

A wealth of state practice does not usually carry with it a presumption that opinio juris exists. “Not only must the acts concerned amount to a settled practice, but they must also be such, or be carried out in such a way, as to be evidence of a belief that this practice is rendered obligatory by the existence of a rule of law requiring it.”

In cases where practice (of which evidence is given) comprises abstentions from acting, consistency of conduct might not establish the existence of a rule of customary international law. The fact that no nuclear weapons have been used since 1945, for example, does not render their use illegal on the basis of a customary obligation because the necessary opinio juris was lacking.

Although the ICJ has frequently referred to opinio juris as being on an equal footing with state practice, the role of the psychological element in the creation of customary law is uncertain.

Jus cogens

A peremptory norm or jus cogens (Latin for "compelling law" or "strong law") is a principle of international law considered so fundamental that it overrides all other sources of international law, including even the Charter of the United Nations. The principle of jus cogens is enshrined in Article 53 of the Vienna Convention on the Law of Treaties:

For the purposes of the present Convention, a peremptory norm of general international law is a norm accepted and recognised by the international community of States as a whole as a norm from which no derogation is permitted and which can be modified only by a subsequent norm of general international law having the same character.

Rules of jus cogens generally require or forbid the state to do particular acts or respect certain rights. However, some define criminal offenses which the state must enforce against individuals. Generally included on lists of such norms are prohibitions of such crimes and internationally wrongful acts as waging aggressive war, war crimes, crimes against humanity, piracy, genocide, apartheid, slavery and torture.

The evidence supporting the emergence of a rule of jus cogens will be essentially similar to that required to establish the creation of a new rule of customary international law. Indeed, jus cogens could be thought of as a special principle of custom with a superadded opinio juris. The European Court of Human Rights has stressed the international public policy aspect of jus cogens.

General principles of law

The scope of general principles of law, to which Article 38(1) of the Statute of the ICJ refers, is unclear and controversial but may include legal principles that are common to a large number of systems of municipal law. Given the limits of treaties or custom as sources of international law, Article 38(1) may be looked upon as a directive to the Court to fill any gap in the law and prevent a non liquet by reference to the general principles.

In earlier stages of the development of international law, rules were frequently drawn from municipal law. In the 19th century, legal positivists rejected the idea that international law could come from any source that did not involve state will or consent but were prepared to allow for the application of general principles of law, provided that they had in some way been accepted by states as part of the legal order. Thus Article 38(1)(c), for example, speaks of general principles "recognized" by states. An area that demonstrates the adoption of municipal approaches is the law applied to the relationship between international officials and their employing organizations, although today the principles are regarded as established international law.

The significance of general principles has undoubtedly been lessened by the increased intensity of treaty and institutional relations between states. Nevertheless, the concepts of estoppel and equity have been employed in the adjudication of international disputes. For example, a state that has, by its conduct, encouraged another state to believe in the existence of a certain legal or factual situation, and to rely on that belief, may be estopped from asserting a contrary situation in its dealings. The principle of good faith was said by the ICJ to be "[o]ne of the basic principles governing the creation and performance of legal obligations". Similarly, there have been frequent references to equity. It is generally agreed that equity cannot be employed to subvert legal rules (that is, operate contra legem). This "equity as law" perception is reinforced by references to equitable principles in the text of the United Nations Convention on the Law of the Sea 1982, though this may be little more than an admission as to the existence, and legitimation, of the discretion of the adjudicator.

However, the principles of estoppel and equity in the international context do not retain all the connotations they do under common law. The reference to the principles as "general" signifies that, if rules were to be adapted from municipal law, they should be at a sufficient level of generality to encompass similar rules existing in many municipal systems. Principles of municipal law should be regarded as sources of inspiration rather than as sources of rules of direct application.

Judicial decisions and juristic writings

According to Article 38(1)(d) of its Statute, the ICJ is also to apply "judicial decisions and the teachings of the most highly qualified publicists of the various nations, as subsidiary means for the determination of rules of law". It is difficult to tell what influence these materials have on the development of the law. Pleadings in cases before the ICJ are often replete with references to case law and to legal literature.

Judicial decisions

The decisions of international and municipal courts and the publications of academics can be referred to, not as a source of law as such, but as a means of recognizing the law established in other sources. In practice, the International Court of Justice does not refer to domestic decisions although it does invoke its previous case-law.

There is no rule of stare decisis in international law. The decision of the Court has no binding force except between the parties and in respect of that particular case. Nevertheless, the Court will often refer to its past decisions and advisory opinions to support its explanation of a present case.

Often the International Court of Justice will consider General Assembly resolutions as indicative of customary international law.

Juristic writings

Article 38(1)(d) of the International Court of Justice Statute states that the 'teachings of the most highly qualified publicists of the various nations' are also among the 'subsidiary means for the determination of the rules of law'. The scholarly works of prominent jurists are not sources of international law but are essential in developing the rules that are sourced in treaties, custom and the general principles of law. This is accepted practice in the interpretation of international law and was utilized by the United States Supreme Court in The Paquete Habana case (175 US (1900) 677 at 700-1).

Informed consent

From Wikipedia, the free encyclopedia
 
Example of informed consent document from the PARAMOUNT trial

Informed consent is a principle in medical ethics and medical law that a patient should have sufficient information before making their own free decisions about their medical care. A healthcare provider is often held to have a responsibility to ensure that the consent that a patient gives is informed, and informed consent can apply to a health care intervention on a person, conducting some form of research on a person, or for disclosing a person's information. A health care provider may ask a patient to consent to receive therapy before providing it, a clinical researcher may ask a research participant before enrolling that person into a clinical trial, and a researcher may ask a research participant before starting some form of controlled experiment. Informed consent is collected according to guidelines from the fields of medical ethics and research ethics.

Free consent is a cognate term enshrined in the International Covenant on Civil and Political Rights. The covenant was adopted in 1966 by the United Nations and came into force on 23 March 1976. Article seven prohibits experiments conducted without the "free consent to medical or scientific experimentation" of the subject. As of September 2019, the covenant has 173 parties and six more signatories without ratification.

Informed consent can be said to have been given based upon a clear appreciation and understanding of the facts, implications, and consequences of an action. To give informed consent, the individual concerned must have adequate reasoning faculties and be in possession of all relevant facts. Impairments to reasoning and judgment that may prevent informed consent include basic intellectual or emotional immaturity, high levels of stress such as post-traumatic stress disorder or a severe intellectual disability, severe mental disorder, intoxication, severe sleep deprivation, Alzheimer's disease, or being in a coma.

Obtaining informed consent is not always required. If an individual is considered unable to give informed consent, another person is generally authorized to give consent on his behalf, e.g., parents or legal guardians of a child (though in this circumstance the child may be required to provide informed assent) and conservators for the mentally disordered, or consent can be assumed through the doctrine of implied consent, e.g., when an unconscious person will die without immediate medical treatment.

In cases where an individual is provided insufficient information to form a reasoned decision, serious ethical issues arise. Such cases in a clinical trial in medical research are anticipated and prevented by an ethics committee or institutional review board.

Informed consent form templates can be found on the website of the World Health Organization.

Assessment

Informed consent can be complex to evaluate, because neither expressions of consent, nor expressions of understanding of implications, necessarily mean that full adult consent was in fact given, nor that full comprehension of relevant issues is internally digested. Consent may be implied within the usual subtleties of human communication, rather than explicitly negotiated verbally or in writing. In some cases consent cannot legally be possible, even if the person protests that they do indeed understand and wish to proceed. There are also structured instruments for evaluating capacity to give informed consent, although no ideal instrument presently exists.

Thus, there is always a degree to which informed consent must be assumed or inferred based upon observation, or knowledge, or legal reliance. This especially is the case in sexual or relational issues. In medical or formal circumstances, explicit agreement by means of signature—normally relied on legally—regardless of actual consent, is the norm. This is the case with certain procedures, such as a "do not resuscitate" directive that a patient signed before onset of their illness.

Brief examples of each of the above:

  1. A person may verbally agree to something from fear, perceived social pressure, or psychological difficulty in asserting true feelings. The person requesting the action may honestly be unaware of this and believe the consent is genuine, and rely on it. Consent is expressed, but not internally given.
  2. A person may claim to understand the implications of some action, as part of consent, but in fact has failed to appreciate the possible consequences fully and may later deny the validity of the consent for this reason. The understanding needed for informed consent appears to be present but is, in fact (through ignorance), absent.
  3. A person signs a legal release form for a medical procedure, and later feels he did not really consent. Unless he can show actual misinformation, the release is usually persuasive or conclusive in law, in that the clinician may rely legally upon it for consent. In formal circumstances, a written consent usually legally overrides later denial of informed consent (unless obtained by misrepresentation).
  4. Informed consent in the U.S. can be overridden in emergency medical situations pursuant to 21 CFR 50.24, which was first brought to the general public's attention via the controversy surrounding the study of PolyHeme.

Valid elements

For an individual to give valid informed consent, three components must be present: disclosure, capacity and voluntariness.

  • Disclosure requires the researcher to supply each prospective subject with the information necessary to make an autonomous decision and also to ensure that the subject adequately understands the information provided. The latter requirement implies that the written consent form be written in lay language suited to the comprehension skills of the subject population, and that the level of understanding be assessed through conversation (to be informed).
  • Capacity pertains to the ability of the subject to both understand the information provided and form a reasonable judgment based on the potential consequences of his/her decision.
  • Voluntariness refers to the subject's right to freely exercise his/her decision making without being subjected to external pressure such as coercion, manipulation, or undue influence.

Waiver of requirement

Waiver of the consent requirement may be applied in certain circumstances where no foreseeable harm is expected to result from the study or when permitted by law, federal regulations, or if an ethical review committee has approved the non-disclosure of certain information.

Besides studies with minimal risk, waivers of consent may be obtained in a military setting. According to 10 USC 980, the United States Code for the Armed Forces, Limitations on the Use of Humans as Experimental Subjects, a waiver of advance informed consent may be granted by the Secretary of Defense if a research project would:

  1. Directly benefit subjects.
  2. Advance the development of a medical product necessary to the military.
  3. Be carried out under all laws and regulations (i.e., Emergency Research Consent Waiver) including those pertinent to the FDA.

While informed consent is a basic right and should be carried out effectively, if a patient is incapacitated due to injury or illness, it is still important that patients benefit from emergency experimentation. The Food and Drug Administration (FDA) and the Department of Health and Human Services (DHHS) joined to create federal guidelines permitting emergency research without informed consent. However, researchers can only proceed with such research if they obtain a waiver of informed consent (WIC) or an emergency exception from informed consent (EFIC).

21st Century Cures Act

The 21st Century Cures Act enacted by the 114th United States Congress in December 2016 allows researchers to waive the requirement for informed consent when clinical testing "poses no more than minimal risk" and "includes appropriate safeguards to protect the rights, safety, and welfare of the human subject."

Medical sociology

Medical sociologists have studied informed consent, as well as bioethics more generally. Oonagh Corrigan, looking at informed consent for research in patients, argues that much of the conceptualization of informed consent comes from research ethics and bioethics with a focus on patient autonomy, and notes that this aligns with a neoliberal worldview. Corrigan argues that a model based solely around individual decision making does not accurately describe the reality of consent because of social processes: a view that has started to be acknowledged in bioethics. She feels that the liberal principles of informed consent are often in opposition with autocratic medical practices, such that norms, values, and systems of expertise often shape an individual's ability to exercise choice.

Patients who agree to participate in trials often do so because they feel that the trial was suggested by a doctor as the best intervention. Patients may find being asked to consent within a limited time frame a burdensome intrusion on their care, particularly when the request arises because the patient is coping with a newly diagnosed condition. Patients involved in trials may not be fully aware of the alternative treatments; being told that there is uncertainty about the best treatment can help make them more aware of the alternatives. Corrigan notes that patients generally expect doctors to be acting exclusively in their interest in interactions, and that this expectation, combined with "clinical equipoise", where a healthcare practitioner does not know which treatment is better in a randomized controlled trial, can be harmful to the doctor-patient relationship.

History

Walter Reed authored these informed consent documents in 1900 for his research on yellow fever (originals in Spanish with English translations)

Informed consent is a technical term first used by the attorney Paul G. Gebhard in a United States medical malpractice court case in 1957. In tracing its history, some scholars have suggested checking for any of these practices:

  1. A patient agrees to a health intervention based on an understanding of it.
  2. The patient has multiple choices and is not compelled to choose a particular one.
  3. The consent includes giving permission.

These practices are part of what constitutes informed consent, and their history is the history of informed consent. They combine to form the modern concept of informed consent—which rose in response to particular incidents in modern research. Whereas various cultures in various places practiced informed consent, the modern concept of informed consent was developed by people who drew influence from Western tradition.

Medical history

In this Ottoman Empire document from 1539 a father promises not to sue a surgeon in case of death following the removal of his son's urinary stones.

Historians cite a series of medical guidelines to trace the history of informed consent in medical practice.

The Hippocratic Oath, a Greek text dating to 500 B.C.E., was the first set of Western writings giving guidelines for the conduct of medical professionals. It advises that physicians conceal most information from patients to give the patients the best care. The rationale is a beneficence model for care—the doctor knows better than the patient, and therefore should direct the patient's care, because the patient is not likely to have better ideas than the doctor.

Henri de Mondeville, a French surgeon who wrote about medical practice in the 14th century, traced his ideas to the Hippocratic Oath. Among his recommendations were that doctors "promise a cure to every patient" in hopes that the good prognosis would inspire a good outcome to treatment. Mondeville never mentioned getting consent, but did emphasize the need for the patient to have confidence in the doctor. He also advised that when deciding therapeutically unimportant details the doctor should meet the patients' requests "so far as they do not interfere with treatment".

In Ottoman Empire records there exists an agreement from 1539 in which the details of a surgery are negotiated, including the fee and a commitment not to sue in case of death.[14] This is the oldest identified written document in which a patient acknowledges risk of medical treatment and writes to express their willingness to proceed.

Benjamin Rush was an 18th-century United States physician who was influenced by the Age of Enlightenment cultural movement. Because of this, he advised that doctors ought to share as much information as possible with patients. He recommended that doctors educate the public and respect a patient's informed decision to accept therapy. There is no evidence that he supported seeking consent from patients. In a lecture titled "On the duties of patients to their physicians", he stated that patients should be strictly obedient to the physician's orders; this was representative of much of his writings. John Gregory, Rush's teacher, expressed similar views, holding that a doctor could best practice beneficence by making decisions for patients without their consent.

Thomas Percival was a British physician who published a book called Medical Ethics in 1803. Percival was a student of the works of Gregory and various earlier Hippocratic physicians. Like all previous works, Percival's Medical Ethics makes no mention of soliciting for the consent of patients or respecting their decisions. Percival said that patients have a right to truth, but when the physician could provide better treatment by lying or withholding information, he advised that the physician do as he thought best.

When the American Medical Association was founded in 1847, it produced the first edition of the American Medical Association Code of Medical Ethics. Many sections of this book are verbatim copies of passages from Percival's Medical Ethics. A new concept in this book was the idea that physicians should fully disclose all patient details truthfully when talking to other physicians, but the text does not also apply this idea to disclosing information to patients. Through this text, Percival's ideas became pervasive guidelines throughout the United States as other texts were derived from them.

Worthington Hooker was an American physician who in 1849 published Physician and Patient. This medical ethics book was radical: while demonstrating an understanding of the AMA's guidelines and Percival's philosophy, it soundly rejected all directives that a doctor should lie to patients. In Hooker's view, benevolent deception is not fair to the patient, and he lectured widely on this topic. Hooker's ideas were not broadly influential.

Research history

Historians cite a series of human subject research experiments to trace the history of informed consent in research.

The U.S. Army Yellow Fever Commission "is considered the first research group in history to use consent forms." In 1900, Major Walter Reed was appointed head of the four-man U.S. Army Yellow Fever Commission in Cuba that determined mosquitoes were the vector for yellow fever transmission. His earliest experiments were probably done without formal documentation of informed consent. In later experiments he obtained support from appropriate military and administrative authorities. He then drafted what is now "one of the oldest series of extant informed consent documents." The three surviving examples are in Spanish with English translations; two have an individual's signature and one is marked with an X.

Tearoom Trade is the name of a book by American psychologist Laud Humphreys. In it he describes his research into male homosexual acts. In conducting this research he never sought consent from his research subjects and other researchers raised concerns that he violated the right to privacy for research participants.

On January 29, 1951, shortly after the birth of her son Joseph, Henrietta Lacks entered Johns Hopkins Hospital in Baltimore with profuse bleeding. She was diagnosed with cervical cancer and was treated with inserts of radium tubes. During her radiation treatments for the tumor, two samples, one of healthy cells and the other of malignant cells, were removed from her cervix without her permission. Later that year, 31-year-old Henrietta Lacks succumbed to the cancer. Her cells were cultured to create the HeLa cell line, but the family was not informed until 1973, when scientists asked family members for DNA samples after finding that HeLa cells had contaminated other samples. In 2013 researchers published the HeLa genome without the Lacks family's consent.

The Milgram experiment is the name of a 1961 experiment conducted by American psychologist Stanley Milgram. In the experiment Milgram had an authority figure order research participants to commit a disturbing act of harming another person. After the experiment he would reveal that he had deceived the participants and that they had not hurt anyone, but the research participants were upset at the experience of having participated in the research. The experiment raised broad discussion on the ethics of recruiting participants for research without giving them full information about the nature of the research.

Chester M. Southam injected HeLa cells into cancer patients and Ohio State Penitentiary inmates without informed consent, to determine whether people could become immune to cancer and whether cancer could be transmitted.

Medical procedures

The doctrine of informed consent relates to professional negligence and establishes a breach of the duty of care owed to the patient (see duty of care, breach of the duty, and respect for persons). The doctrine of informed consent also has significant implications for medical trials of medications, devices, or procedures.

Requirements of the professional

Until 2015 in the United Kingdom, and in countries such as Malaysia and Singapore, informed consent in medical procedures required proof as to the standard of care expected as a recognised standard of acceptable professional practice (the Bolam Test), that is, what risks a medical professional would usually disclose in the circumstances (see Loss of right in English law). Arguably, this is "sufficient consent" rather than "informed consent." The UK has since departed from the Bolam test for judging standards of informed consent, due to the landmark ruling in Montgomery v Lanarkshire Health Board. This moves away from the concept of a reasonable physician and instead uses the standard of a reasonable patient, and what risks an individual would attach significance to.

Medicine in the United States, Australia, and Canada also takes this patient-centric approach to "informed consent." Informed consent in these jurisdictions requires healthcare providers to disclose significant risks, as well as risks of particular importance to that patient. This approach combines an objective (a hypothetical reasonable patient) and subjective (this particular patient) approach.

The doctrine of informed consent should be contrasted with the general doctrine of medical consent, which applies to assault or battery. The consent standard here is only that the person understands, in general terms, the nature of and purpose of the intended intervention. As the higher standard of informed consent applies to negligence, not battery, the other elements of negligence must be made out. Significantly, causation must be shown: That had the individual been made aware of the risk he would not have proceeded with the operation (or perhaps with that surgeon).

Optimal establishment of informed consent requires adaptation to cultural or other individual factors of the patient. For example, people from Mediterranean and Arab cultures appear to rely more on the context of the delivery of the information, with the information being carried more by who is saying it and where, when, and how it is being said, rather than what is said, which is of relatively more importance in typical "Western" countries.

The informed consent doctrine is generally implemented through good healthcare practice: pre-operation discussions with patients and the use of medical consent forms in hospitals. However, reliance on a signed form should not undermine the basis of the doctrine in giving the patient an opportunity to weigh and respond to the risk. In one British case, a doctor performing routine surgery on a woman noticed that she had cancerous tissue in her womb. He took the initiative to remove the woman's womb; however, as she had not given informed consent for this operation, the doctor was judged by the General Medical Council to have acted negligently. The council stated that the woman should have been informed of her condition, and allowed to make her own decision.

Obtaining informed consents

To document that informed consent has been given for a procedure, healthcare organisations have traditionally used paper-based consent forms on which the procedure and its risks and benefits are noted, and which are signed by both patient and clinician. In a number of healthcare organisations consent forms are scanned and maintained in an electronic document store. The paper consent process has been demonstrated to be associated with significant errors of omission, and therefore increasing numbers of organisations are using digital consent applications where the risk of errors can be minimised, a patient's decision making and comprehension can be supported by additional lay-friendly and accessible information, consent can be completed remotely, and the process can become paperless. One form of digital consent is dynamic consent, which invites participants to provide consent in a granular way, and makes it easier for them to withdraw consent if they wish.

Electronic consent methods have been used to support indexing and retrieval of consent data, thus enhancing the ability to honor patient intent and identify willing research participants. More recently, Health Sciences South Carolina, a statewide research collaborative focused on transforming healthcare quality, health information systems and patient outcomes, developed an open-source system called the Research Permissions Management System (RPMS).

Competency of the patient

The ability to give informed consent is governed by a general requirement of competency. In common law jurisdictions, adults are presumed competent to consent. This presumption can be rebutted, for instance, in circumstances of mental illness or other incompetence. This may be prescribed in legislation or based on a common-law standard of inability to understand the nature of the procedure. In cases of incompetent adults, a health care proxy makes medical decisions. In the absence of a proxy, the medical practitioner is expected to act in the patient's best interests until a proxy can be found.

By contrast, 'minors' (which may be defined differently in different jurisdictions) are generally presumed incompetent to consent, but depending on their age and other factors may be required to provide Informed assent. In some jurisdictions (e.g. much of the U.S.), this is a strict standard. In other jurisdictions (e.g. England, Australia, Canada), this presumption may be rebutted through proof that the minor is 'mature' (the 'Gillick standard'). In cases of incompetent minors, informed consent is usually required from the parent (rather than the 'best interests standard') although a parens patriae order may apply, allowing the court to dispense with parental consent in cases of refusal.

Deception

Research involving deception is controversial given the requirement for informed consent. Deception typically arises in social psychology, when researching a particular psychological process requires that investigators deceive subjects. For example, in the Milgram experiment, researchers wanted to determine the willingness of participants to obey authority figures despite their personal conscientious objections. They had authority figures demand that participants deliver what they thought was an electric shock to another research participant. For the study to succeed, it was necessary to deceive the participants so they believed that the subject was a peer and that their electric shocks caused the peer actual pain.

Nonetheless, research involving deception prevents subjects from exercising their basic right of autonomous informed decision-making and conflicts with the ethical principle of respect for persons.

The Ethical Principles of Psychologists and Code of Conduct set by the American Psychological Association says that psychologists may conduct research that includes a deceptive component only if they can both justify the act by the value and importance of the study's results and show they could not obtain the results in any other way. Moreover, the research should bear no potential harm to the subject as an outcome of deception, whether physical pain or emotional distress. Finally, the code requires a debriefing session in which the experimenter both tells the subject about the deception and gives the subject the option of withdrawing the data.

Abortion

In some U.S. states, informed consent laws (sometimes called "right to know" laws) require that a woman seeking an elective abortion receive information from the abortion provider about her legal rights, alternatives to abortion (such as adoption), available public and private assistance, and other information specified in the law, before the abortion is performed. Other countries with such laws (e.g. Germany) require that the information giver be properly certified to make sure that no abortion is carried out for the financial gain of the abortion provider and to ensure that the decision to have an abortion is not swayed by any form of incentive.

Some informed consent laws have been criticized for allegedly using "loaded language in an apparently deliberate attempt to 'personify' the fetus," but those critics acknowledge that "most of the information in the [legally mandated] materials about abortion comports with recent scientific findings and the principles of informed consent", although "some content is either misleading or altogether incorrect."

From children

As children often lack the decision-making ability or legal power (competence) to provide true informed consent for medical decisions, it often falls on parents or legal guardians to provide informed permission for medical decisions. This "consent by proxy" usually works reasonably well, but can lead to ethical dilemmas when the judgment of the parents or guardians and the medical professional differ with regard to what constitutes appropriate decisions "in the best interest of the child". Children who are legally emancipated, unemancipated minors who are deemed to have medical decision-making capacity, and minors in certain situations such as decisions regarding sexually transmitted diseases or pregnancy, may be able to provide consent without the need for parental permission, depending on the laws of the jurisdiction the child lives in. The American Academy of Pediatrics encourages medical professionals also to seek the assent of older children and adolescents by providing age-appropriate information to these children to help empower them in the decision-making process.

Research on children has benefited society in many ways. The only effective way to establish normal patterns of growth and metabolism is to do research on infants and young children. When addressing the issue of informed consent with children, the primary response is parental consent. This is valid, although only legal guardians are able to consent for a child, not adult siblings. Additionally, parents may not order the termination of a treatment that is required to keep a child alive, even if they feel it is in the best interest. Guardians are typically involved in the consent of children, however a number of doctrines have developed that allow children to receive health treatments without parental consent. For example, emancipated minors may consent to medical treatment, and minors can also consent in an emergency.

Consent to research

Informed consent is part of ethical clinical research as well, in which a human subject voluntarily confirms his or her willingness to participate in a particular clinical trial, after having been informed of all aspects of the trial that are relevant to the subject's decision to participate. Informed consent is documented by means of a written, signed, and dated informed consent form. In medical research, the Nuremberg Code set a base international standard in 1947, developed in response to the ethical violations of the Holocaust, and standards have continued to develop since. Nowadays, medical research is overseen by an ethics committee that also oversees the informed consent process.

As the medical guidelines established in the Nuremberg Code were imported into the ethical guidelines for the social sciences, informed consent became a common part of the research procedure. However, while informed consent is the default in medical settings, it is not always required in the social sciences. Here, research often involves low or no risk for participants, unlike in many medical experiments. Second, the mere knowledge that they are participating in a study can cause people to alter their behavior, as in the Hawthorne Effect: "In the typical lab experiment, subjects enter an environment in which they are keenly aware that their behavior is being monitored, recorded, and subsequently scrutinized." In such cases, seeking informed consent directly interferes with the ability to conduct the research, because the very act of revealing that a study is being conducted is likely to alter the behavior studied. List exemplifies the potential dilemma that can result: "if one were interested in exploring whether, and to what extent, race or gender influences the prices that buyers pay for used cars, it would be difficult to measure accurately the degree of discrimination among used car dealers who know that they are taking part in an experiment." In cases where such interference is likely, and after careful consideration, a researcher may forgo the informed consent process. This is commonly done after weighing the risk to study participants against the benefit to society, and considering whether participants are present in the study of their own volition and are treated fairly. Researchers often consult with an ethics committee or institutional review board to render a decision.

The birth of new online media, such as social media, has complicated the idea of informed consent. In an online environment people pay little attention to Terms of Use agreements and can subject themselves to research without thorough knowledge. This issue came to public light following a study conducted by Facebook Inc. in 2014, and published by that company and Cornell University. Facebook conducted a study in which it altered the Facebook News Feeds of roughly 700,000 users to reduce either the number of positive or negative posts they saw for a week. The study then analyzed whether the users' status updates changed during the different conditions. The study was published in the Proceedings of the National Academy of Sciences.

The lack of informed consent led to outrage among many researchers and users. Many believed that by potentially altering the mood of users through changing which posts they see, Facebook put at-risk individuals at higher risk of depression and suicide. However, supporters of Facebook claim that Facebook's terms of use state that it has the right to use information for research. Others say the experiment is just a part of Facebook's current work, which alters News Feed algorithms continually to keep people interested and coming back to the site. Others pointed out that this specific study is not alone: news organizations constantly try out different headlines, using algorithms to elicit emotions and garner clicks or Facebook shares. They say this Facebook study is no different from things people already accept. Still others say that Facebook broke the law when conducting the experiment on users who did not give informed consent.

The Facebook study controversy raises numerous questions about informed consent and the differences in the ethical review process between publicly and privately funded research. Some say Facebook was within its limits and others see the need for more informed consent and/or the establishment of in-house private review boards.

Conflicts of interest

Other, long-standing controversies underscore the role for conflicts of interest among medical school faculty and researchers. For example, coverage of University of California (UC) medical school faculty members has included news of ongoing corporate payments to researchers and practitioners from companies that market and produce the very devices and treatments they recommend to patients. Robert Pedowitz, the former chairman of UCLA's orthopedic surgery department, reported concern that his colleague's financial conflicts of interest could negatively affect patient care or research into new treatments. In a subsequent lawsuit about whistleblower retaliation, the university provided a $10 million settlement to Pedowitz while acknowledging no wrongdoing. Consumer Watchdog, an oversight group, observed that University of California policies were "either inadequate or unenforced...Patients in UC hospitals deserve the most reliable surgical devices and medication…and they shouldn't be treated as subjects in expensive experiments." Other UC incidents include taking the eggs of women for implantation into other women without consent and injecting live bacteria into human brains, resulting in potentially premature deaths.

History of human migration

From Wikipedia, the free encyclopedia
Refugees seeking asylum in Greece

Human migration is the movement by people from one place to another, particularly different countries, with the intention of settling temporarily or permanently in the new location. It typically involves movements over long distances and from one country or region to another.

Historically, early human migration includes the peopling of the world, i.e. migration to world regions where there was previously no human habitation, during the Upper Paleolithic. Since the Neolithic, most migrations (except for the peopling of remote regions such as the Arctic or the Pacific) were predominantly warlike, consisting of conquest or Landnahme on the part of expanding populations. Colonialism involves expansion of sedentary populations into previously only sparsely settled territories or territories with no permanent settlements. In the modern period, human migration has primarily taken the form of migration within and between existing sovereign states, either controlled (legal immigration) or uncontrolled and in violation of immigration laws (illegal immigration).

Migration can be voluntary or involuntary. Involuntary migration includes forced displacement (in various forms such as deportation, slave trade, trafficking in human beings) and flight (war refugees, ethnic cleansing), both resulting in the creation of diasporas.

Pre-modern history

Studies show that the pre-modern migration of human populations begins with the movement of Homo erectus out of Africa across Eurasia about 1.75 million years ago. Homo sapiens appears to have occupied all of Africa about 150,000 years ago; some members of this species moved out of Africa 70,000 years ago (or, according to more recent studies, as early as 125,000 years ago into Asia, and even as early as 270,000 years ago). It is suggested that modern non-African populations descend mostly from a later migration out of Africa between 70,000 and 50,000 years ago, which spread across Australia, Asia and Europe by 40,000 BCE. Migration to the Americas took place 20,000 to 15,000 years ago. By 2000 years ago humans had established settlements in most of the Pacific Islands. Major population-movements notably include those postulated as associated with the Neolithic Revolution and with Indo-European expansion. The Early Medieval Great Migrations including Turkic expansion have left significant traces. In some places, such as Turkey and Azerbaijan, there was a substantial cultural transformation after the migration of relatively small elite populations. Historians see elite-migration parallels in the Roman and Norman conquests of Britain, while "the most hotly debated of all the British cultural transitions is the role of migration in the relatively sudden and drastic change from Romano-Britain to Anglo-Saxon Britain", which may be explained by a possible "substantial migration of Anglo-Saxon Y chromosomes into Central England (contributing 50%–100% to the gene pool at that time)."

Chronological dispersal of Austronesian people across the Indo-Pacific

Early humans migrated due to many factors, such as changing climate and landscape and an inadequate food supply for the existing population levels. The evidence indicates that the ancestors of the Austronesian peoples spread from the South Chinese mainland to the island of Taiwan around 8,000 years ago. Evidence from historical linguistics suggests that seafaring peoples migrated from Taiwan, perhaps in distinct waves separated by millennia, to the entire region encompassed by the Austronesian languages. Scholars believe that this migration began around 6,000 years ago. Indo-Aryan migration from the Indus Valley to the plain of the River Ganges in Northern India is presumed to have taken place in the Middle to Late Bronze Age, contemporary with the Late Harappan phase in India (around 1700 to 1300 BCE). From 180 BCE, a series of invasions from Central Asia reached the northwestern Indian subcontinent, including those led by the Indo-Greeks, Indo-Scythians, Indo-Parthians and Kushans.

From 728 BCE, the Greeks began 250 years of expansion, settling colonies in several places, including Sicily and Marseille. Classical-era Europe provides evidence of two major migration movements: the Celtic peoples in the first millennium BCE, and the later Migration Period of the first millennium CE from the North and East. Both may be examples of general cultural change sparked by primarily elite and warrior migration. A smaller migration (or sub-migration) involved the Magyars moving into Pannonia (modern-day Hungary) in the 9th century CE. Turkic peoples spread from their homeland in modern Turkestan across most of Central Asia into Europe and the Middle East between the 6th and 11th centuries CE. Recent research suggests that Madagascar was uninhabited until Austronesian seafarers from present-day Indonesia arrived during the 5th and 6th centuries CE. Subsequent migrations both from the Pacific and from Africa further consolidated this original mixture, and Malagasy people emerged.

4th to 6th century Migration Period

Before the expansion of the Bantu languages and their speakers, the southern half of Africa is believed to have been populated by Pygmies and Khoisan-speaking people, whose descendants today occupy the arid regions around the Kalahari Desert and the forests of Central Africa. By about 1000 CE the Bantu migration had reached modern-day Zimbabwe and South Africa. The Banu Hilal and Banu Ma'qil, a collection of Arab Bedouin tribes from the Arabian Peninsula, migrated westwards via Egypt between the 11th and 13th centuries. Their migration strongly contributed to the Arabisation and Islamisation of the western Maghreb, until then dominated by Berber tribes. Ostsiedlung was the medieval eastward migration and settlement of Germans, following in the footsteps of the East Germanic Goths and North Germanic Varangians. The 13th century was the time of the great Mongol and Turkic migrations across Eurasia; the Eurasian steppe has time and again provided a ready migration path, for example for the Huns, Bulgars, Tatars and Slavs.

Between the 11th and 18th centuries, numerous migrations took place in Asia. The Vatsayan priests migrated from the eastern Himalayan hills to Kashmir during the Shan invasion in the 13th century and settled in the lower Shivalik Hills to sanctify the manifest goddess. The Vietnamese began expanding southward in the 11th century, a process known in Vietnamese as nam tiến (southward expansion). The early Qing Dynasty (founded in 1636) separated Manchuria from China proper with the Inner Willow Palisade, which restricted the movement of Han Chinese into Manchuria; the area remained off-limits to the Han until the Qing began colonizing it with Han settlers later in the dynasty's rule, in the late 18th century.

The Age of Exploration and European colonialism have led to an accelerated pace of migration since Early Modern times. In the 16th century, perhaps 240,000 Europeans entered American ports. In the 19th century, over 50 million people left Europe for the Americas alone. The local populations or tribes, such as the Aboriginal peoples of Canada, Brazil, Argentina, Australia, and the United States, were often numerically overwhelmed by incoming settlers and by those settlers' indentured laborers and imported slaves.

Modern history

Industrialization

Factory chimney releasing gas into the blue sky.

The pace of migration had already accelerated during the 18th century (including the involuntary slave trade), and it increased further in the 19th century. Manning distinguishes three major types of migration: labor migration, refugee migrations, and urbanization. Millions of agricultural workers left the countryside and moved to the cities, causing unprecedented levels of urbanization. This phenomenon began in Britain in the late 18th century, spread around the world, and continues to this day in many areas.

Industrialization encouraged migration wherever it appeared. The increasingly global economy globalized the labor market. The Atlantic slave trade diminished sharply after 1820, which gave rise to self-bound contract labor migration from Europe and Asia to plantations. Overcrowding, open agricultural frontiers, and rising industrial centers attracted voluntary migrants. Moreover, migration was made significantly easier by improved transportation techniques.

Romantic nationalism also rose in the 19th century, and with it ethnocentrism. The great European industrial empires also rose. Both factors contributed to migration, as some countries favored their own ethnicity over outsiders while other countries appeared considerably more welcoming. For example, the Russian Empire identified with Eastern Orthodoxy and confined Jews, who were not Eastern Orthodox, to the Pale of Settlement, imposing restrictions on them. Violence was also a problem. The United States was promoted as a better location, a "golden land" where Jews could live more openly. Colonialism, another effect of imperialism, led to the migration of some colonizing parties from "home countries" to "the colonies", and eventually to the migration of people from the "colonies" to the "home countries".

Transnational labor migration reached a peak of three million migrants per year in the early twentieth century. Italy, Norway, Ireland and the Guangdong region of China were regions with especially high emigration rates during these years. These large migration flows influenced the process of nation-state formation in many ways. Immigration restrictions were developed, as were diaspora cultures and myths that reflect the importance of migration to the foundation of certain nations, like the American melting pot. Transnational labor migration fell to a lower level from the 1930s to the 1960s and then rebounded.

The United States experienced considerable internal migration related to industrialization, including among its African American population. From 1910 to 1970, approximately 7 million African Americans migrated from the rural Southern United States, where black people faced both poor economic opportunities and considerable political and social prejudice, to the industrial cities of the Northeast, Midwest and West, where relatively well-paid jobs were available. This phenomenon came to be known in the United States as its own Great Migration, although historians today consider the migration to have had two distinct phases. The term "Great Migration", without a qualifier, is now most often used to refer to the first phase, which ended roughly at the time of the Great Depression. The second phase, lasting roughly from the start of U.S. involvement in World War II to 1970, is now called the Second Great Migration. With the demise of legalised segregation in the 1960s and greatly improved economic opportunities in the South in the subsequent decades, millions of black Americans have returned to the South from other parts of the country since 1980 in what has been called the New Great Migration.

World wars and aftermath

A Swiss woman and her children fleeing the Russian Civil War, around 1921

The First and Second World Wars, and the wars, genocides, and crises sparked by them, had an enormous impact on migration. During the collapse of the Ottoman Empire, Muslims moved from the Balkans to Turkey, while Christians moved the other way. In April 1915 the Ottoman government embarked upon the systematic decimation of its civilian Armenian population. The persecutions continued with varying intensity until 1923, when the Ottoman Empire ceased to exist and was replaced by the Republic of Turkey. The Armenian population of the Ottoman state was reported at about two million in 1915. An estimated one million had perished by 1918, while hundreds of thousands had become homeless and stateless refugees. By 1923 virtually the entire Armenian population of Anatolian Turkey had disappeared. Four hundred thousand Jews had already moved to Palestine in the early twentieth century, and numerous Jews had moved to America, as already mentioned. The Russian Civil War caused some three million Russians, Poles, and Germans to migrate out of the new Soviet Union. Decolonization following the Second World War also caused migrations.

The Jewish communities across Europe, the Mediterranean and the Middle East were formed from voluntary and involuntary migrants. After the Holocaust (1938 to 1945), there was increased migration to the British Mandate of Palestine, which became the modern state of Israel as a result of the United Nations Partition Plan for Palestine.

Provisions of the 1945 Potsdam Agreement, signed by the victorious Western Allies and the Soviet Union, led to one of the largest European migrations, and the largest of the 20th century. It involved the migration and resettlement of close to or over 20 million people. The largest affected group were the 16.5 million Germans expelled from Eastern Europe westwards. The second largest group were Poles, millions of whom were expelled westwards from the eastern Kresy region and resettled in the so-called Recovered Territories (see Allies decide Polish border in the article on the Oder-Neisse line). Hundreds of thousands of Poles, Ukrainians (Operation Vistula), Lithuanians, Latvians, Estonians and some Belarusians were expelled eastwards from Europe to the Soviet Union. Finally, many of the several hundred thousand Jews remaining in Eastern Europe after the Holocaust migrated outside Europe, to Israel and the United States.

Partition of India

In 1947, upon the Partition of India, large populations moved from India to Pakistan and vice versa, depending on their religious beliefs. The partition was created by the Indian Independence Act 1947 as a result of the dissolution of the British Indian Empire. The partition displaced up to 17 million people in the former British Indian Empire, with estimates of loss of life varying from several hundred thousand to a million. Muslim residents of the former British India migrated to Pakistan (including East Pakistan, now Bangladesh), whilst Hindu and Sikh residents of Pakistan and Hindu residents of East Pakistan (now Bangladesh) moved in the opposite direction.

In modern India, estimates based on industry sectors mainly employing migrants suggest that there are around 100 million circular migrants in India. Caste, social networks and historical precedents play a powerful role in shaping patterns of migration.

Research by the Overseas Development Institute identifies a rapid movement of labor from slower- to faster-growing parts of the economy. Migrants can often find themselves excluded by urban housing policies, and migrant support initiatives are needed to give workers improved access to market information, certification of identity, housing and education.

In the riots which preceded the partition in the Punjab region, between 200,000 and 500,000 people were killed in the retributive genocide. The UNHCR estimates that 14 million Hindus, Sikhs and Muslims were displaced during the partition. Scholars call it the largest mass migration in human history: Nigel Smith, in his book Pakistan: History, Culture, and Government, calls it "history's greatest migration."

Contemporary history (1960s to present)

Introduction to entropy

From Wikipedia, the free encyclopedia