Dataset columns: headers (sequence, lengths 0–4), text (string, lengths 54–5.14k), id (129 classes), title (129 classes), categories (sequence, lengths 0–83), seealso (sequence, lengths 0–35).
[ "Strategy and commanders", "British strategy" ]
The British military had considerable experience of fighting in North America, most recently during the Seven Years' War, which forced France to give up [[New France]] in 1763. In previous conflicts, however, they had benefited from local logistics and the support of the colonial militia, neither of which was available in the American Revolutionary War. Reinforcements had to come from Europe, and maintaining large armies over such distances was extremely complex; ships could take three months to cross the Atlantic, and orders from London were often outdated by the time they arrived. Prior to the conflict, the colonies were largely autonomous economic and political entities, with no centralized area of ultimate strategic importance. This meant that, unlike in Europe, where the fall of a capital city often ended wars, the war in America continued even after the loss of major settlements such as Philadelphia, the seat of Congress, New York, and Charleston. British power relied on the Royal Navy, whose dominance allowed the British to resupply their own expeditionary forces while preventing access to enemy ports. However, the majority of the American population was agrarian rather than urban; supported by the French navy and blockade runners based in the [[Dutch Caribbean]], the American economy was able to survive. The geographical size of the colonies and limited manpower meant the British could not simultaneously conduct military operations and occupy territory without local support. Debate persists over whether their defeat was inevitable; one British statesman described it as "like trying to conquer a map". While [[John E. Ferling|Ferling]] argues Patriot victory was nothing short of a miracle, [[Joseph Ellis|Ellis]] suggests the odds always favored the Americans, especially after Howe squandered the chance of a decisive British success in 1776, an "opportunity that would never come again". The official US Army history of the war speculates that the additional commitment of 10,000 fresh troops in 1780 would have placed British victory "within the realm of possibility".
771
American Revolutionary War
[ "American Revolutionary War", "Conflicts in 1775", "Conflicts in 1776", "Conflicts in 1777", "Conflicts in 1778", "Conflicts in 1779", "Conflicts in 1780", "Conflicts in 1781", "Conflicts in 1782", "Conflicts in 1783", "Global conflicts", "Rebellions against the British Empire", "Wars between the United Kingdom and the United States", "Wars of independence" ]
[ "Timeline of the American Revolution", "1776 in the United States" ]
[ "Strategy and commanders", "British strategy", "British Army" ]
The expulsion of France from North America in 1763 led to a drastic reduction in British troop levels in the colonies; in 1775, there were only 8,500 regular soldiers among a civilian population of 2.8 million. The bulk of military resources in the Americas was focused on defending sugar islands in the Caribbean; [[Colony of Jamaica|Jamaica]] alone generated more revenue than all thirteen American colonies combined. With the end of the Seven Years' War, the permanent army in Britain was also cut back, which resulted in administrative difficulties when the war began a decade later. Over the course of the war, there were four separate British commanders-in-chief. The first was Thomas Gage, appointed in 1763, whose initial focus was establishing British rule in former French areas of Canada. Rightly or wrongly, many in London blamed the revolt on his failure to take firm action earlier, and he was relieved after the heavy losses incurred at Bunker Hill. His replacement was Sir William Howe, a member of the Whig faction in Parliament who opposed the policy of coercion advocated by Lord North; Cornwallis, who later surrendered at Yorktown, was one of many senior officers who initially refused to serve in North America. The 1775 campaign showed the British had overestimated the capabilities of their own troops and underestimated the colonial militia, requiring a reassessment of tactics and strategy; in the meantime, the Patriots took the initiative, and British authorities rapidly lost control over every colony. Howe's responsibility is still debated; despite receiving large numbers of reinforcements, Bunker Hill seems to have permanently affected his self-confidence, and a lack of tactical flexibility meant he often failed to follow up opportunities. Many of his decisions were attributed to supply problems, such as the delay in launching the New York campaign and the failure to pursue Washington's beaten army. Having lost the confidence of his subordinates, he was recalled after Burgoyne surrendered at Saratoga. Following the failure of the Carlisle Commission, British policy changed from treating the Patriots as subjects who needed to be reconciled to treating them as enemies who had to be defeated. In 1778, Howe was replaced by Sir Henry Clinton, appointed instead of Carleton, who was considered overly cautious. Regarded as an expert on tactics and strategy, Clinton, like his predecessors, was handicapped by chronic supply issues. As a result, he was largely inactive in 1779 and much of 1780; in October 1780, he warned Germain of "fatal consequences" if matters did not improve. In addition, Clinton's strategy was compromised by conflict with political superiors in London and his colleagues in North America, especially Admiral [[Mariot Arbuthnot]], who was replaced in early 1781 by Rodney. He was neither notified nor consulted when Germain approved Cornwallis' invasion of the south in 1781, and he delayed sending reinforcements, believing the bulk of Washington's army was still outside New York City. After the surrender at Yorktown, Clinton was relieved by Carleton, whose major task was to oversee the evacuation of Loyalists and British troops from Savannah, Charleston, and New York City.
771
American Revolutionary War
[ "American Revolutionary War", "Conflicts in 1775", "Conflicts in 1776", "Conflicts in 1777", "Conflicts in 1778", "Conflicts in 1779", "Conflicts in 1780", "Conflicts in 1781", "Conflicts in 1782", "Conflicts in 1783", "Global conflicts", "Rebellions against the British Empire", "Wars between the United Kingdom and the United States", "Wars of independence" ]
[ "Timeline of the American Revolution", "1776 in the United States" ]
[ "Strategy and commanders", "British strategy", "German Troops" ]
During the 18th century, hiring foreign soldiers was common practice among European states, and Britain was no exception; during the Seven Years' War, foreign troops comprised 10% of the British army, and their use caused little debate. When it became clear additional troops were needed to suppress the revolt in America, it was decided to employ mercenaries. There were several reasons for this, including public sympathy for the Patriot cause, an historical reluctance to expand the British army, and the time needed to recruit and train new regiments. An alternative source was readily available in the [[Holy Roman Empire]], where many smaller states had a long tradition of renting their armies to the highest bidder. The most important was [[Landgraviate of Hesse-Kassel|Hesse-Cassel]], known as "the Mercenary State". The first supply agreements were signed by the North administration in late 1775; over the next decade, more than 40,000 Germans fought in North America, Gibraltar, South Africa, and India, of whom 30,000 served in the American War. Often generically referred to as "Hessians", they included men from many other states, including [[Electorate of Brunswick-Lüneburg|Hanover]] and [[Principality of Brunswick-Wolfenbüttel|Brunswick]]. Sir Henry Clinton recommended recruiting Russian troops, whom he rated very highly, having seen them in action against the [[Russo-Turkish War (1768–1774)|Ottomans]]; however, negotiations with [[Catherine the Great]] made little progress. Unlike in previous wars, their use led to intense political debate in Britain, France, and even Germany, where [[Frederick the Great]] refused to provide passage through his territories for troops hired for the American war. In March 1776, the agreements were challenged in Parliament by Whigs who objected to "coercion" in general, and to the use of foreign soldiers to subdue "British subjects". The debates were covered in detail by American newspapers, which reprinted key speeches, and in May 1776 they received copies of the treaties themselves. Provided by British sympathizers, these were smuggled into North America from London by George Merchant, a recently released American prisoner. The prospect of mercenaries being used in the colonies bolstered support for independence, more so than taxation and other acts combined; the King was accused of declaring war on his own subjects, leading to the idea that there were now two separate governments. By apparently showing that Britain was determined to go to war, the treaties made hopes of reconciliation seem naive and hopeless, while the employment of "foreign mercenaries" became one of the charges levelled against George III in the Declaration of Independence. The Hessian reputation within Germany for brutality also increased support for the Patriot cause among German-American immigrants.
771
American Revolutionary War
[ "American Revolutionary War", "Conflicts in 1775", "Conflicts in 1776", "Conflicts in 1777", "Conflicts in 1778", "Conflicts in 1779", "Conflicts in 1780", "Conflicts in 1781", "Conflicts in 1782", "Conflicts in 1783", "Global conflicts", "Rebellions against the British Empire", "Wars between the United Kingdom and the United States", "Wars of independence" ]
[ "Timeline of the American Revolution", "1776 in the United States" ]
[ "Strategy and commanders", "British strategy", "German Troops" ]
The presence of over 150,000 German-Americans meant both sides felt these mercenaries might be persuaded to desert; one reason Clinton suggested employing Russians was that he felt they were less likely to defect. When the first German troops arrived on [[Staten Island]] in August 1776, Congress approved the printing of "handbills" promising land and citizenship to any willing to join the Patriot cause. The British launched a counter-campaign claiming deserters could well be executed for meddling in a war that was not theirs. Desertion among the Germans occurred throughout the war, peaking in the period between the surrender at Yorktown and the Treaty of Paris. German regiments were central to the British war effort; of the estimated 30,000 sent to America, some 13,000 became casualties.
771
American Revolutionary War
[ "American Revolutionary War", "Conflicts in 1775", "Conflicts in 1776", "Conflicts in 1777", "Conflicts in 1778", "Conflicts in 1779", "Conflicts in 1780", "Conflicts in 1781", "Conflicts in 1782", "Conflicts in 1783", "Global conflicts", "Rebellions against the British Empire", "Wars between the United Kingdom and the United States", "Wars of independence" ]
[ "Timeline of the American Revolution", "1776 in the United States" ]
[ "Revolution as civil war", "Loyalists" ]
Wealthy Loyalists convinced the British government that most of the colonists were sympathetic toward the Crown; consequently, British military planners relied on recruiting Loyalists, but had trouble recruiting sufficient numbers as the Patriots had widespread support. Nevertheless, they continued to deceive themselves about the level of American support as late as 1780, a year before hostilities ended. Approximately 25,000 Loyalists fought for the British throughout the war. Although Loyalists constituted about twenty percent of the colonial population, they were concentrated in distinct communities. Many of them lived among large plantation owners in the [[Tidewater (region)|Tidewater region]] and [[South Carolina in the American Revolution#Early conflicts|South Carolina]], who produced tobacco and indigo cash crops for global markets comparable to those for Caribbean sugar. When the British began probing the backcountry in 1777–1778, they were faced with a major problem: any significant level of organized Loyalist activity required a continued presence of British regulars. The available British manpower in America was insufficient to protect Loyalist territory and counter American offensives. The Loyalist militias in the South were constantly defeated by neighboring Patriot militias. The most critical combat between the two partisan militias was at the [[Battle of Kings Mountain]]; the Patriot victory irreversibly crippled any further Loyalist militia capability in the South. While the early-war policy was administered by General [[William Howe, 5th Viscount Howe|William Howe]], the Crown's need to maintain Loyalist support prevented it from using traditional methods of suppressing revolt. The British cause suffered when their troops ransacked local homes during an aborted attack on Charleston in 1779, enraging both Patriots and Loyalists. After Congress rejected the [[Carlisle Peace Commission]] in 1778 and Westminster turned to "hard war" during Clinton's command, neutral colonists in the Carolinas often allied with the Patriots whenever brutal combat broke out between Tories and Whigs. Conversely, Loyalists gained support when Patriots intimidated suspected Tories by destroying property or [[tarring and feathering]]. One Loyalist militia unit—the [[British Legion (American Revolution)|British Legion]]—provided some of the best troops in British service, good enough to earn it a place in the British Army: it was a mixed regiment of 250 [[dragoon]]s and 200 infantry supported by batteries of flying artillery. It was commanded by [[Banastre Tarleton]] and gained a fearsome reputation in the colonies for "brutality and needless slaughter". In May 1779 the British Legion was one of five regiments that formed the [[American establishment (British army)|American Establishment]].
771
American Revolutionary War
[ "American Revolutionary War", "Conflicts in 1775", "Conflicts in 1776", "Conflicts in 1777", "Conflicts in 1778", "Conflicts in 1779", "Conflicts in 1780", "Conflicts in 1781", "Conflicts in 1782", "Conflicts in 1783", "Global conflicts", "Rebellions against the British Empire", "Wars between the United Kingdom and the United States", "Wars of independence" ]
[ "Timeline of the American Revolution", "1776 in the United States" ]
[ "Revolution as civil war", "Women" ]
Women played various roles during the Revolutionary War; they often accompanied their husbands when permitted to do so. For example, throughout the war [[Martha Washington]] was known to visit and provide aid to her husband George at various American camps, and [[Frederika Charlotte Riedesel]] documented the [[Saratoga campaign]]. Women often accompanied armies as [[camp follower]]s to sell goods and perform necessary tasks in hospitals and camps. They were a necessary part of eighteenth-century armies and numbered in the thousands during the war. Women also assumed military roles: aside from auxiliary tasks such as treating the wounded or setting up camp, some dressed as men to directly support combat, fight, or act as spies on both sides of the Revolutionary War. Anna Maria Lane joined her husband in the Army and was wearing men's clothes by the time of the [[Battle of Germantown]]. The Virginia General Assembly later cited her bravery: she fought while dressed as a man and "performed extraordinary military services, and received a severe wound at the battle of Germantown ... with the courage of a soldier". On April 26, 1777, [[Sybil Ludington]] rode to alert the militia of Putnam County, New York, and Danbury, Connecticut, of the approach of British forces; she has been called the "female Paul Revere". A few others [[List of wartime cross-dressers|disguised themselves as men]]. [[Deborah Sampson]] fought until her gender was discovered and she was discharged as a result; [[Sally St. Clair]] was killed in action during the war.
771
American Revolutionary War
[ "American Revolutionary War", "Conflicts in 1775", "Conflicts in 1776", "Conflicts in 1777", "Conflicts in 1778", "Conflicts in 1779", "Conflicts in 1780", "Conflicts in 1781", "Conflicts in 1782", "Conflicts in 1783", "Global conflicts", "Rebellions against the British Empire", "Wars between the United Kingdom and the United States", "Wars of independence" ]
[ "Timeline of the American Revolution", "1776 in the United States" ]
[ "Revolution as civil war", "African Americans" ]
When war began, the population of the Thirteen Colonies included an estimated 500,000 slaves, predominantly used as labor on [[Plantation complexes in the Southern United States|Southern plantations]]. In November 1775, [[John Murray, 4th Earl of Dunmore|Lord Dunmore]], the Royal Governor of Virginia, issued a [[Dunmore's Proclamation|proclamation]] that promised freedom to any Patriot-owned slaves willing to bear arms. Although the announcement helped to fill a temporary manpower shortage, white Loyalist prejudice meant recruits were eventually redirected to non-combatant roles. The Loyalists' motive was to deprive Patriot [[Planter class|planters]] of labor rather than to end slavery; Loyalist-owned slaves were returned. The 1779 [[Philipsburg Proclamation]] issued by Clinton extended the offer of freedom to Patriot-owned slaves throughout the colonies; by removing the requirement for military service, it persuaded entire families to escape to British lines, many of whom were employed on farms to grow food for the army. While Clinton organized the [[Black Pioneers]], he also ensured fugitive slaves were returned to Loyalist owners, with orders that they were not to be punished for their attempted escape. As the war progressed, service as regular soldiers in British units became increasingly common; black Loyalists formed two regiments of the Charleston garrison in 1783. Estimates of the numbers who served the British during the war vary from 25,000 to 50,000, excluding those who escaped during wartime. Thomas Jefferson estimated that Virginia may have lost 30,000 slaves to escapes in total. In South Carolina, nearly 25,000 slaves (about 30 percent of the enslaved population) either fled, migrated, or died, which significantly disrupted the plantation economies both during and after the war. [[Black Patriot]]s were barred from the Continental Army until Washington convinced Congress in January 1778 that there was no other way to replace losses from disease and desertion. The [[1st Rhode Island Regiment]], formed in February 1778, included former slaves whose owners were compensated; however, only 140 of its 225 soldiers were black, and recruitment stopped in June 1778. Ultimately, around 5,000 African-Americans served in the Continental Army and Navy in a variety of roles, while another 4,000 were employed in Patriot militia units, aboard privateers, or as teamsters, servants, and spies. After the war, a small minority received land grants or Congressional pensions in old age; many others were returned to their masters despite earlier promises of freedom. As a Patriot victory became increasingly likely, the treatment of Black Loyalists became a point of contention; after the surrender at Yorktown in 1781, Washington insisted all escapees be returned, but Cornwallis refused. In 1782 and 1783, around 8,000 to 10,000 freed blacks were evacuated by the British from Charleston, Savannah, and New York; some moved on to London, while 3,000 to 4,000 settled in [[Nova Scotia]], where they founded settlements such as [[Birchtown, Nova Scotia|Birchtown]]. White Loyalists transported 15,000 enslaved blacks to [[Colony of Jamaica|Jamaica]] and the [[Bahamas]]. The free Black Loyalists who migrated to the [[British West Indies]] included regular soldiers from Dunmore's [[Ethiopian Regiment]], and those from Charleston who helped garrison the [[Leeward Islands]].
771
American Revolutionary War
[ "American Revolutionary War", "Conflicts in 1775", "Conflicts in 1776", "Conflicts in 1777", "Conflicts in 1778", "Conflicts in 1779", "Conflicts in 1780", "Conflicts in 1781", "Conflicts in 1782", "Conflicts in 1783", "Global conflicts", "Rebellions against the British Empire", "Wars between the United Kingdom and the United States", "Wars of independence" ]
[ "Timeline of the American Revolution", "1776 in the United States" ]
[ "Revolution as civil war", "American Indians" ]
Most [[Native Americans in the United States|American Indians]] east of the [[Mississippi River]] were affected by the war, and many tribes were divided over how to respond to the conflict. A few tribes were friendly with the colonists, but most Indians opposed the union of the Colonies as a potential threat to their territory. Approximately 13,000 Indians fought on the British side, with the largest group coming from the [[Iroquois]] tribes, who deployed around 1,500 men. Early in July 1776, [[Cherokee]] allies of Britain attacked the short-lived [[Washington District, North Carolina|Washington District]] of [[North Carolina Colony|North Carolina]]. Their defeat splintered both Cherokee settlements and people, and was directly responsible for the rise of the [[Chickamauga Cherokee]], who perpetuated the [[Cherokee–American wars]] against American settlers for decades after hostilities with Britain ended. [[Muscogee people|Creek]] and [[Seminole]] allies of Britain fought against Americans in Georgia and South Carolina. In 1778, a force of 800 Creeks destroyed American settlements along the [[Broad River (Georgia)|Broad River]] in Georgia. Creek warriors also joined [[Thomas Brown (loyalist)|Thomas Brown's]] raids into South Carolina and assisted Britain during the [[Siege of Savannah]]. Many Indians were involved in the fight between Britain and Spain on the Gulf Coast and along the British side of the Mississippi River. Thousands of Creeks, [[Chickasaw]], and [[Choctaw]] fought in major battles such as the [[Battle of Fort Charlotte]], the [[Battle of Mobile (1781)|Battle of Mobile]], and the [[Siege of Pensacola]]. The [[Iroquois Confederacy]] was shattered as a result of the American Revolutionary War, whichever side its members took: the [[Seneca nation|Seneca]], [[Onondaga (tribe)|Onondaga]], and [[Cayuga nation|Cayuga]] tribes sided with the British; members of the [[Mohawk nation|Mohawks]] fought on both sides; and many [[Tuscarora (tribe)|Tuscarora]] and [[Oneida tribe|Oneida]] sided with the Americans. To retaliate against raids on American settlements by Loyalists and their Indian allies, the Continental Army dispatched the punitive [[Sullivan Expedition]] through New York to cripple the Iroquois tribes that had sided with the British. Mohawk leaders [[Joseph Louis Cook]] and [[Joseph Brant]] sided with the Americans and the British respectively, which further exacerbated the split. In the [[western theater of the American Revolutionary War]], conflicts between settlers and Indians led to lingering distrust. In the [[Treaty of Paris (1783)|1783 Treaty of Paris]], Great Britain ceded control of the disputed lands between the Great Lakes and the [[Ohio River]], but the Indian inhabitants were not a part of the peace negotiations. Tribes in the [[Northwest Territory]] joined together as the [[Western Confederacy]] and allied with the British to resist American settlement, and their conflict continued after the Revolutionary War as the [[Northwest Indian War]].
771
American Revolutionary War
[ "American Revolutionary War", "Conflicts in 1775", "Conflicts in 1776", "Conflicts in 1777", "Conflicts in 1778", "Conflicts in 1779", "Conflicts in 1780", "Conflicts in 1781", "Conflicts in 1782", "Conflicts in 1783", "Global conflicts", "Rebellions against the British Empire", "Wars between the United Kingdom and the United States", "Wars of independence" ]
[ "Timeline of the American Revolution", "1776 in the United States" ]
[ "Britain's \"American war\" and peace", "Changing Prime Ministers" ]
[[Frederick North, 2nd Earl of Guilford|Lord North]], Prime Minister since 1770, delegated control of the war in North America to [[Lord George Germain]] and the [[John Montagu, 4th Earl of Sandwich|Earl of Sandwich]], who was [[First Lord of the Admiralty|head of the Royal Navy]] from 1771 to 1782. Defeat at Saratoga in 1777 made it clear the revolt would not be easily suppressed, especially after the Franco-American alliance of February 1778 and the French declaration of war in June. With Spain also expected to join the conflict, the Royal Navy needed to prioritize either the war in America or the war in Europe; Germain advocated the former, Sandwich the latter. British negotiators now proposed a second peace settlement to Congress. The terms presented by the [[Carlisle Peace Commission]] included acceptance of the principle of self-government: Parliament would recognize Congress as the governing body, suspend any objectionable legislation, surrender its right to local colonial taxation, and discuss including American representatives in the House of Commons. In return, all property confiscated from Loyalists would be returned, British debts honored, and locally enforced martial law accepted. However, Congress demanded either immediate recognition of independence or the withdrawal of all British troops; they knew the commission was not authorized to accept these terms, bringing negotiations to a rapid end. When the commissioners returned to London in November 1778, they recommended a change in policy. Sir Henry Clinton, the new British Commander-in-Chief in America, was ordered to treat the rebels as enemies to be defeated, rather than as subjects whose loyalty might be regained. Those standing orders remained in effect for three years, until Clinton was relieved. North backed the Southern strategy, hoping to exploit divisions between the mercantile north and slave-owning south, but after Yorktown he accepted this policy had failed. It was clear the war was lost, although the Royal Navy forced the French to relocate their fleet to the Caribbean in November 1781 and resumed a close blockade of American trade. The resulting economic damage and rising inflation meant the US was now eager to end the war, while France was unable to provide further loans; Congress could no longer pay its soldiers. On February 27, 1782, a Whig motion to end the offensive war in America was carried by 19 votes. North now resigned, obliging the king to invite [[Charles Watson-Wentworth, 2nd Marquess of Rockingham|Lord Rockingham]] to form a government; a consistent supporter of the Patriot cause, he made a commitment to US independence a condition of doing so. George III reluctantly accepted, and the [[Second Rockingham ministry|new government]] took office on March 27, 1782; however, Rockingham died unexpectedly on July 1 and was replaced by [[Shelburne ministry|Lord Shelburne]], who acknowledged American independence.
771
American Revolutionary War
[ "American Revolutionary War", "Conflicts in 1775", "Conflicts in 1776", "Conflicts in 1777", "Conflicts in 1778", "Conflicts in 1779", "Conflicts in 1780", "Conflicts in 1781", "Conflicts in 1782", "Conflicts in 1783", "Global conflicts", "Rebellions against the British Empire", "Wars between the United Kingdom and the United States", "Wars of independence" ]
[ "Timeline of the American Revolution", "1776 in the United States" ]
[ "Britain's \"American war\" and peace", "American Congress signs a peace" ]
When Lord Rockingham, the Whig leader and a friend of the American cause, was elevated to Prime Minister, Congress consolidated its diplomatic consuls in Europe into a peace delegation at Paris. All were experienced in Congressional leadership. The dean of the delegation was [[Benjamin Franklin]] of Pennsylvania. He had become a celebrity in the French court, but he was also an Enlightenment scientist with influence in the courts of the European great powers, including Prussia, England's former ally, and Austria, a Catholic empire like Spain. Since the 1760s he had been an organizer of British American inter-colony cooperation, and then a colonial lobbyist to Parliament in London. [[John Adams]] of Massachusetts had been consul to the Dutch Republic and was a prominent early New England Patriot. [[John Jay]] of New York had been consul to Spain and was a past president of the Continental Congress. As consul to the Dutch Republic, [[Henry Laurens]] of South Carolina had secured a preliminary trade agreement. He had been a successor to John Jay as [[President of the Continental Congress|president of Congress]] and, with Franklin, was a member of the [[American Philosophical Society]]. Although active in the preliminaries, he was not a signer of the conclusive treaty. The Whig negotiators for Lord Rockingham and his successor, Prime Minister Lord Shelburne, included [[David Hartley (the Younger)|David Hartley]], a long-time friend of Benjamin Franklin from his time in London, and [[Richard Oswald (merchant)|Richard Oswald]], who had negotiated Laurens' release from the Tower of London. The Preliminary Peace signed on November 30 met four key Congressional demands: independence, territory up to the Mississippi, navigation rights into the Gulf of Mexico, and fishing rights in Newfoundland. British strategy was to strengthen the US sufficiently to prevent France from regaining a foothold in North America, so they had little interest in the proposals that France and Spain later put forward. However, divisions between their opponents allowed them to negotiate separately with each to improve their overall position, starting with the American delegation in September 1782. The French and Spanish sought to improve their own positions by keeping the new United States dependent on them for support against Britain, thus reversing the losses of 1763. Both parties tried to negotiate a settlement with Britain that excluded the Americans: France proposed setting the western boundary of the US along the Appalachians, matching the British [[Royal Proclamation of 1763|1763 Proclamation Line]], while the Spanish suggested additional concessions in the vital Mississippi River Basin but required the cession of [[Georgia in the American Revolution|Georgia]], in violation of the Franco-American alliance.
771
American Revolutionary War
[ "American Revolutionary War", "Conflicts in 1775", "Conflicts in 1776", "Conflicts in 1777", "Conflicts in 1778", "Conflicts in 1779", "Conflicts in 1780", "Conflicts in 1781", "Conflicts in 1782", "Conflicts in 1783", "Global conflicts", "Rebellions against the British Empire", "Wars between the United Kingdom and the United States", "Wars of independence" ]
[ "Timeline of the American Revolution", "1776 in the United States" ]
[ "Britain's \"American war\" and peace", "American Congress signs a peace" ]
Facing difficulties with Spain over claims involving the Mississippi River, and with France, which was still reluctant to agree to American independence until all her demands were met, John Jay promptly told the British that he was willing to negotiate directly with them, cutting out France and Spain; Prime Minister Lord Shelburne, in charge of the British negotiations, agreed. Key terms for America in obtaining peace included the recognition of United States independence; the gain of all the area east of the Mississippi River, north of Florida, and south of Canada; the granting of fishing rights in the [[Grand Banks]], off the coast of [[Newfoundland]], and in the Gulf of [[Saint Lawrence]]; and perpetual access to the Mississippi River for both the United States and Great Britain. An Anglo-American Preliminary Peace was formally entered into in November 1782, and Congress endorsed the settlement on April 15, 1783, announcing the achievement of peace with independence. The "conclusive" treaty was signed on September 3, 1783, in Paris, the same day Britain signed its treaty with France. John Adams, who helped draft the treaty, claimed it represented "one of the most important political events that ever happened on the globe". Ratified respectively by Congress and Parliament, the final versions were exchanged in Paris the following spring. On November 25, 1783, the last British troops remaining in the US were evacuated from New York to Halifax.
771
American Revolutionary War
[ "American Revolutionary War", "Conflicts in 1775", "Conflicts in 1776", "Conflicts in 1777", "Conflicts in 1778", "Conflicts in 1779", "Conflicts in 1780", "Conflicts in 1781", "Conflicts in 1782", "Conflicts in 1783", "Global conflicts", "Rebellions against the British Empire", "Wars between the United Kingdom and the United States", "Wars of independence" ]
[ "Timeline of the American Revolution", "1776 in the United States" ]
[ "Aftermath" ]
Washington expressed astonishment that the Americans had won a war against a leading world power, referring to the American victory as "little short of a standing miracle". The conflict between British subjects loyal to the Crown and those siding with Congress had lasted over eight years, from 1775 to 1783. The last uniformed British troops [[Evacuation Day (New York)|departed]] the remaining east coast port cities of Savannah, Charleston, and New York City by November 25, 1783, marking the end of British occupation in the new United States. On April 9, 1783, Washington issued orders that he had long waited to give: that "all acts of hostility" were to cease immediately. That same day, by arrangement with Washington, [[Guy Carleton, 1st Baron Dorchester|General Carleton]] issued a similar order to British troops. British troops, however, were not to evacuate until a prisoner of war exchange occurred, an effort that involved much negotiation and took some seven months to effect. As directed by a Congressional resolution of May 26, 1783, all non-commissioned officers and enlisted men were furloughed "to their homes" until the "definitive treaty of peace", when they would be automatically discharged. The US armies were disbanded in the field under Washington's General Orders of Monday, June 2, 1783. Once the conclusive Treaty of Paris was signed with Britain, Washington resigned his commission as commander-in-chief before Congress and retired to Mount Vernon.
771
American Revolutionary War
[ "American Revolutionary War", "Conflicts in 1775", "Conflicts in 1776", "Conflicts in 1777", "Conflicts in 1778", "Conflicts in 1779", "Conflicts in 1780", "Conflicts in 1781", "Conflicts in 1782", "Conflicts in 1783", "Global conflicts", "Rebellions against the British Empire", "Wars between the United Kingdom and the United States", "Wars of independence" ]
[ "Timeline of the American Revolution", "1776 in the United States" ]
[ "Aftermath", "Territory" ]
The expanse of territory that was now the United States had been ceded by its colonial [[Homeland#Motherland|mother country]] alone. It included millions of sparsely settled acres south of the [[Great Lakes|Great Lakes]], between the Appalachian Mountains and the Mississippi River. The tentative colonial migration west became a flood during the years of the Revolutionary War. Virginia's Kentucky County counted 150 men in 1775; by 1790, fifteen years later, it numbered over 73,000 and was seeking statehood in the United States. Britain's extended post-war policy toward the US continued to try to establish an Indian buffer state below the Great Lakes as late as 1814, during the [[War of 1812]]. The formally acquired western American lands continued to be populated by a dozen or so American Indian tribes that had, for the most part, been British allies. Although the forts on their lands had passed from French to British hands before the creation of the United States, the Indians themselves were not referred to in the British cession to the US. While the tribes were not consulted by the British for the treaty, in practice the British refused to abandon the forts on territory they had formally transferred; instead, they provisioned military allies for continuing frontier raids and sponsored the [[Northwest Indian War|Northwest Indian War (1785–1795)]]. British sponsorship of local warfare against the United States continued until the Anglo-American [[Jay Treaty]] went into effect. At the same time, the Spanish also sponsored war within the US through Indian proxies in its Southwest Territory, which had been ceded by France to Britain and then by Britain to the Americans. Of the European powers with American colonies adjacent to the newly created United States, Spain was the most threatened by American independence, and it was correspondingly the most hostile to it. Its territory adjacent to the US was relatively undefended, so Spanish policy developed a combination of initiatives. Spanish soft power diplomatically challenged the British territorial cession west to the Mississippi and the previous northern boundaries of [[Spanish Florida]]; it imposed a high tariff on American goods and then blocked American settler access to the port of New Orleans. Spanish hard power extended war alliances and arms to southwestern Indians to resist American settlement. A former Continental Army general, [[James Wilkinson]], settled in [[History of Kentucky#Kentucky in the American Revolution (1775–1783)|Kentucky County]], Virginia, in 1784, and there he fostered settler secession from Virginia during the Spanish-allied [[Cherokee–American wars|Chickamauga Cherokee war]]. Beginning in 1787, he received pay as Spanish Agent 13, and he subsequently expanded his efforts to persuade American settlers west of the Appalachians to secede from the United States, first during the Washington administration and again during the Jefferson administration.
771
American Revolutionary War
[ "American Revolutionary War", "Conflicts in 1775", "Conflicts in 1776", "Conflicts in 1777", "Conflicts in 1778", "Conflicts in 1779", "Conflicts in 1780", "Conflicts in 1781", "Conflicts in 1782", "Conflicts in 1783", "Global conflicts", "Rebellions against the British Empire", "Wars between the United Kingdom and the United States", "Wars of independence" ]
[ "Timeline of the American Revolution", "1776 in the United States" ]
[ "Aftermath", "Casualties and losses" ]
The total loss of life throughout the conflict is largely unknown. As was typical in wars of the era, diseases such as [[smallpox]] claimed more lives than battle. Between 1775 and 1782, a [[1775–82 North American smallpox epidemic|smallpox epidemic]] broke out throughout North America, killing an estimated 130,000 among all its populations during those years. Historian [[Joseph Ellis]] suggests that Washington's decision to have his troops [[Variolation|inoculated]] against the disease was one of his most important decisions. Up to 70,000 American Patriots died during active military service. Of these, approximately 6,800 were killed in battle, while at least 17,000 died from disease. The majority of the latter died while [[prisoners of war]] of the British, mostly in the [[Prisoners in the American Revolutionary War|prison ships]] in New York Harbor. The number of Patriots seriously wounded or disabled by the war has been estimated from 8,500 to 25,000. The French suffered 2,112 killed in combat in the United States. The Spanish lost a total of 124 killed and 247 wounded in West Florida. A British report in 1781 puts their total Army deaths at 6,046 in North America (1775–1779). Approximately 7,774 [[Germans in the American Revolution#Allies of Great Britain|Germans]] died in British service in addition to 4,888 deserters; of the former, it is estimated 1,800 were killed in combat.
771
American Revolutionary War
[ "American Revolutionary War", "Conflicts in 1775", "Conflicts in 1776", "Conflicts in 1777", "Conflicts in 1778", "Conflicts in 1779", "Conflicts in 1780", "Conflicts in 1781", "Conflicts in 1782", "Conflicts in 1783", "Global conflicts", "Rebellions against the British Empire", "Wars between the United Kingdom and the United States", "Wars of independence" ]
[ "Timeline of the American Revolution", "1776 in the United States" ]
[ "Aftermath", "Legacy" ]
The American Revolution established the United States with its numerous civil liberties and set an example for overthrowing both monarchy and colonial governments. The United States has the world's oldest written constitution, and the constitutions of other free countries often bear a striking resemblance to the US Constitution, sometimes word-for-word in places. It inspired the French, Haitian, and Latin American revolutions, and others into the modern era. Although the Revolution eliminated many forms of inequality, it did little to change the status of women, despite the role they played in winning independence. Most significantly, it failed to end slavery, which continued to be a serious social and political issue and caused divisions that would ultimately end in [[American Civil War|civil war]]. While many were uneasy over the contradiction of demanding liberty for some yet denying it to others, the dependence of southern states on slave labor made abolition too great a challenge. Between 1774 and 1780, many of the states banned the importation of slaves, but the institution itself continued. In 1782, Virginia passed a law permitting manumission, and over the next eight years more than 10,000 slaves were given their freedom. With support from Benjamin Franklin, in 1790 the [[Quakers]] petitioned Congress to abolish slavery; the number of abolitionist movements greatly increased, and by 1804 all the northern states had outlawed it. However, even many like Adams who viewed slavery as a 'foul contagion' opposed the 1790 petition as a threat to the Union. In 1807, Jefferson signed legislation [[Act Prohibiting Importation of Slaves|banning the importation of slaves]] from 1808, but the domestic slave trade was allowed to continue, on the argument that the federal government had no right to regulate individual states.
771
American Revolutionary War
[ "American Revolutionary War", "Conflicts in 1775", "Conflicts in 1776", "Conflicts in 1777", "Conflicts in 1778", "Conflicts in 1779", "Conflicts in 1780", "Conflicts in 1781", "Conflicts in 1782", "Conflicts in 1783", "Global conflicts", "Rebellions against the British Empire", "Wars between the United Kingdom and the United States", "Wars of independence" ]
[ "Timeline of the American Revolution", "1776 in the United States" ]
[ "Aftermath", "Historiography" ]
A large historiography concerns the reasons the Americans revolted and successfully broke away. The "Patriots", an insulting term used by the British that was proudly adopted by the Americans, stressed the constitutional rights of Englishmen, especially "[[No taxation without representation]]." Historians since the 1960s have emphasized that the Patriot constitutional argument was made possible by the emergence of a sense of American nationalism that united all 13 colonies. In turn, that nationalism was rooted in a [[republicanism in the United States|Republican value system]] that demanded consent of the governed and opposed aristocratic control. In Britain itself, republicanism was a fringe view since it challenged the aristocratic control of the British political system. Political power was not controlled by an aristocracy or nobility in the 13 colonies; instead, the colonial political system was based on the winners of free elections, which were open to the majority of white men. In the analysis of the coming of the Revolution, historians in recent decades have mostly used one of three approaches. The [[Atlantic history]] view places the American story in a broader context, including revolutions in France and Haiti; it tends to reintegrate the historiographies of the American Revolution and the British Empire. The "[[new social history]]" approach looks at community social structure to find cleavages that were magnified into colonial cleavages. The third, ideological approach centers on republicanism in the United States. Republicanism dictated there would be no royalty, aristocracy, or national church, but allowed for the continuation of the British common law, which American lawyers and jurists understood, approved of, and used in their everyday practice. Historians have examined how the rising American legal profession adopted British common law to incorporate republicanism by selective revision of legal customs and by introducing more choices for courts.
771
American Revolutionary War
[ "American Revolutionary War", "Conflicts in 1775", "Conflicts in 1776", "Conflicts in 1777", "Conflicts in 1778", "Conflicts in 1779", "Conflicts in 1780", "Conflicts in 1781", "Conflicts in 1782", "Conflicts in 1783", "Global conflicts", "Rebellions against the British Empire", "Wars between the United Kingdom and the United States", "Wars of independence" ]
[ "Timeline of the American Revolution", "1776 in the United States" ]
[ "Commemorations of the Revolutionary War" ]
After the first U.S. postage stamps were issued in 1847, the U.S. Post Office frequently issued commemorative stamps celebrating the various people and events of the Revolutionary War. However, it would be more than 140 years after the Revolution before any stamp commemorating that war was issued; the first such stamp was the 'Liberty Bell' issue of 1926.
771
American Revolutionary War
[ "American Revolutionary War", "Conflicts in 1775", "Conflicts in 1776", "Conflicts in 1777", "Conflicts in 1778", "Conflicts in 1779", "Conflicts in 1780", "Conflicts in 1781", "Conflicts in 1782", "Conflicts in 1783", "Global conflicts", "Rebellions against the British Empire", "Wars between the United Kingdom and the United States", "Wars of independence" ]
[ "Timeline of the American Revolution", "1776 in the United States" ]
[]
The '''ampere''' (symbol: '''A'''), often [[Clipping (morphology)|shortened]] to "amp", is the [[SI base unit|base unit]] of [[electric current]] in the [[International System of Units]] (SI). It is named after [[André-Marie Ampère]] (1775–1836), the French mathematician and physicist considered the father of [[electromagnetism]]. The International System of Units formerly defined the ampere in terms of other base units by measuring the electromagnetic force between electrical conductors carrying electric current. The earlier [[Centimetre–gram–second system of units|CGS system]] had two different definitions of current, one essentially the same as the SI's and the other using [[electric charge]] as the base unit, with the unit of charge defined by measuring the force between two charged metal plates; the ampere was then defined as one [[coulomb]] of charge per second. In the SI, the unit of charge, the coulomb, is defined as the charge carried by one ampere during one second. [[2019 redefinition of the SI base units|New definitions]], in terms of invariant constants of nature, specifically the [[elementary charge]], took effect on 20 May 2019.
772
Ampere
[ "SI base units", "Units of electric current" ]
[ "Ammeter", "Magnetic constant", "Ampacity", "Electric shock", "Electric current", "Orders of magnitude (current)", "Hydraulic analogy" ]
[ "Definition" ]
The ampere is defined by taking the fixed numerical value of the [[elementary charge]] to be 1.602 176 634 × 10⁻¹⁹ when expressed in the unit C, which is equal to A⋅s, where the second is defined in terms of Δν<sub>Cs</sub>, the unperturbed ground-state hyperfine transition frequency of the caesium-133 atom. The SI unit of charge, the [[coulomb]], "is the quantity of electricity carried in 1 second by a current of 1 ampere". Conversely, a current of one ampere is one coulomb of charge going past a given point per second: 1 A = 1 C/s. In general, charge Q is determined by a steady current I flowing for a time t as Q = I·t. Constant, instantaneous, and average current are expressed in amperes (as in "the charging current is 1.2 A"), and the charge accumulated (or passed through a circuit) over a period of time is expressed in coulombs (as in the charge stored in a [[battery (electricity)|battery]]). The relation of the ampere (C/s) to the coulomb is the same as that of the [[watt]] (J/s) to the [[joule]].
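As a quick worked example of the charge–current relation just stated (the 1.2 A charging current is taken from the text above; the one-hour duration is an illustrative assumption):

```latex
% Charge delivered by a steady 1.2 A current over one hour (illustrative values)
\[
  Q = I\,t = 1.2\ \mathrm{A} \times 3600\ \mathrm{s} = 4320\ \mathrm{C}
\]
```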
772
Ampere
[ "SI base units", "Units of electric current" ]
[ "Ammeter", "Magnetic constant", "Ampacity", "Electric shock", "Electric current", "Orders of magnitude (current)", "Hydraulic analogy" ]
[ "History" ]
The ampere is named for French physicist and mathematician [[André-Marie Ampère]] (1775–1836), who studied [[electromagnetism]] and laid the foundation of [[electrodynamics]]. In recognition of Ampère's contributions to the creation of modern electrical science, an international convention, signed at the 1881 [[International Exposition of Electricity]], established the ampere as a standard unit of electrical measurement for electric current. The ampere was originally defined as one tenth of the unit of [[electric current]] in the [[centimetre–gram–second system of units]]. That unit, now known as the [[abampere]], was defined as the amount of current that generates a force of two [[dyne]]s per centimetre of length between two wires one centimetre apart. The size of the unit was chosen so that the units derived from it in the [[MKS system of units|MKSA]] system would be conveniently sized. The "international ampere" was an early realization of the ampere, defined as the current that would deposit 1.118 milligrams of silver per second from a [[silver nitrate]] solution. Later, more accurate measurements revealed that this current is 0.99985 A. Since [[power (physics)|power]] is defined as the product of current and voltage, the ampere can alternatively be expressed in terms of the other units using the relationship P = IV, and thus 1 A = 1 W/V. Current can be measured by a [[multimeter]], a device that can measure electrical voltage, current, and resistance.
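Written out, the unit relations mentioned above are as follows (the abampere conversion simply restates "one tenth of the CGS unit"):

```latex
% The ampere via power and voltage, and its relation to the CGS abampere
\[
  P = IV \;\Rightarrow\; 1\ \mathrm{A} = \frac{1\ \mathrm{W}}{1\ \mathrm{V}},
  \qquad
  1\ \mathrm{A} = 0.1\ \text{abampere}.
\]
```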
772
Ampere
[ "SI base units", "Units of electric current" ]
[ "Ammeter", "Magnetic constant", "Ampacity", "Electric shock", "Electric current", "Orders of magnitude (current)", "Hydraulic analogy" ]
[ "History", "Former definition in the SI" ]
Until 2019, the SI defined the ampere as follows: The ampere is that constant current which, if maintained in two straight parallel conductors of infinite length, of negligible circular cross-section, and placed one [[metre]] apart in vacuum, would produce between these conductors a force equal to 2 × 10⁻⁷ [[newton (unit)|newtons]] per metre of length. [[Ampère's force law]] states that there is an attractive or repulsive force between two parallel wires carrying an electric current. This force is used in the formal definition of the ampere. The SI unit of charge, the [[coulomb]], was then defined as "the quantity of electricity carried in 1 second by a current of 1 ampere". Conversely, a current of one ampere is one coulomb of charge going past a given point per second: 1 A = 1 C/s. In general, charge was determined by a steady current I flowing for a time t as Q = I·t.
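The 2 × 10⁻⁷ figure follows directly from Ampère's force law for two long parallel wires, since under the pre-2019 SI the magnetic constant μ₀ was fixed at exactly 4π × 10⁻⁷ N A⁻²:

```latex
% Force per unit length between two parallel wires each carrying 1 A, spaced 1 m apart
\[
  \frac{F}{L} = \frac{\mu_0 I_1 I_2}{2\pi r}
             = \frac{\left(4\pi\times 10^{-7}\ \mathrm{N\,A^{-2}}\right)(1\ \mathrm{A})(1\ \mathrm{A})}{2\pi\,(1\ \mathrm{m})}
             = 2\times 10^{-7}\ \mathrm{N/m}.
\]
```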
772
Ampere
[ "SI base units", "Units of electric current" ]
[ "Ammeter", "Magnetic constant", "Ampacity", "Electric shock", "Electric current", "Orders of magnitude (current)", "Hydraulic analogy" ]
[ "Realisation" ]
The standard ampere is most accurately realised using a [[Kibble balance]], but is in practice maintained via [[Ohm's law]] from the units of [[electromotive force]] and [[Electrical resistance and conductance|resistance]], the [[volt]] and the [[ohm]], since the latter two can be tied to physical phenomena that are relatively easy to reproduce, the [[Josephson effect]] and the [[quantum Hall effect]], respectively. Techniques to establish the realisation of an ampere have a [[Approximation error|relative uncertainty]] of approximately a few parts in 10⁷, and involve realisations of the watt, the ohm and the volt.
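A minimal sketch (not metrology-grade; the junction count and drive frequency are illustrative assumptions) of how a current can be maintained via Ohm's law from Josephson-effect voltage and quantum-Hall resistance standards:

```python
# Deriving a current via Ohm's law, I = V / R, where V and R come from
# quantum electrical standards. Uses the exact 2019 SI values of e and h.
e = 1.602176634e-19   # elementary charge, C (exact by definition)
h = 6.62607015e-34    # Planck constant, J*s (exact by definition)

K_J = 2 * e / h       # Josephson constant, Hz/V  (~4.8360e14)
R_K = h / e**2        # von Klitzing constant, ohm (~25812.807)

# Josephson voltage standard: N junctions driven at microwave frequency f give V = N*f/K_J.
N, f = 10_000, 70e9   # illustrative values only
V = N * f / K_J       # volts

# Quantum Hall resistance standard on the i = 2 plateau: R = R_K / 2.
R = R_K / 2

I = V / R             # the maintained current, in amperes
print(f"V = {V:.4f} V, R = {R:.3f} ohm, I = {I*1e3:.4f} mA")
```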
772
Ampere
[ "SI base units", "Units of electric current" ]
[ "Ammeter", "Magnetic constant", "Ampacity", "Electric shock", "Electric current", "Orders of magnitude (current)", "Hydraulic analogy" ]
[]
[[Image:Euclid flowchart.svg|thumb|right| [[Flowchart]] of an algorithm ([[Euclid's algorithm]]) for calculating the greatest common divisor (g.c.d.) of two numbers ''a'' and ''b'' in locations named A and B. The algorithm proceeds by successive subtractions in two loops: IF the test B ≥ A yields "yes" or "true" (more accurately, the ''number'' ''b'' in location B is greater than or equal to the ''number'' ''a'' in location A) THEN, the algorithm specifies B ← B − A (meaning the number ''b'' − ''a'' replaces the old ''b''). Similarly, IF A > B, THEN A ← A − B. The process terminates when (the contents of) B is 0, yielding the g.c.d. in A. (Algorithm derived from Scott 2009:13; symbols and drawing style from Tausworthe 1977).]] In [[mathematics]] and [[computer science]], an '''algorithm''' () is a finite sequence of [[well-defined]], computer-implementable instructions, typically used to solve a class of problems or to perform a computation. Algorithms are always [[unambiguous]] and are used as specifications for performing [[calculation]], [[data processing]], [[automated reasoning]], and other tasks. As an [[effective method]], an algorithm can be expressed within a finite amount of space and time, and in a well-defined formal language for calculating a [[Function (mathematics)|function]]. Starting from an initial state and initial input (perhaps [[Empty string|empty]]), the instructions describe a [[computation]] that, when [[Execution (computing)|executed]], proceeds through a finite number of well-defined successive states, eventually producing "output" and terminating at a final ending state. The transition from one state to the next is not necessarily [[deterministic]]; some algorithms, known as [[randomized algorithms]], incorporate random input. The concept of the algorithm has existed since antiquity. [[Arithmetic]] algorithms, such as a [[division algorithm]], were used by ancient [[Babylonian mathematics|Babylonian mathematicians]] c. 2500 BC and [[Egyptian mathematics|Egyptian mathematicians]] c. 1550 BC. [[Greek mathematics|Greek mathematicians]] later used algorithms around 240 BC in the [[sieve of Eratosthenes]] for finding prime numbers and in the [[Euclidean algorithm]] for finding the [[greatest common divisor]] of two numbers. [[Arabic mathematics|Arabic mathematicians]] such as [[al-Kindi]] in the 9th century used [[cryptographic]] algorithms for [[code-breaking]], based on [[frequency analysis]]. The word ''algorithm'' itself is derived from the name of the 9th-century mathematician [[Muhammad ibn Musa al-Khwarizmi|Muḥammad ibn Mūsā al-Khwārizmī]], whose [[Nisba (suffix)|nisba]] (identifying him as from [[Khwarazm]]) was Latinized as ''Algoritmi''. A partial formalization of what would become the modern concept of the algorithm began with attempts to solve the ''[[Entscheidungsproblem]]'' (decision problem) posed by [[David Hilbert]] in 1928. Later formalizations were framed as attempts to define "[[effective calculability]]" or "effective method". Those formalizations included the [[Kurt Gödel|Gödel]]–[[Jacques Herbrand|Herbrand]]–[[Stephen Cole Kleene|Kleene]] [[Recursion (computer science)|recursive functions]] of 1930, 1934, and 1935, [[Alonzo Church]]'s [[lambda calculus]] of 1936, [[Emil Post]]'s [[Formulation 1]] of 1936, and [[Alan Turing]]'s [[Turing machines]] of 1936–37 and 1939.
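To make the subtraction-based procedure described in the flowchart caption concrete, here is a minimal Python sketch (the function name and sample inputs are illustrative, not part of the cited sources):

```python
def gcd_by_subtraction(a: int, b: int) -> int:
    """Greatest common divisor of positive integers a and b, computed by
    successive subtraction as in the two-loop flowchart (locations A and B)."""
    while b != 0:      # the process terminates when (the contents of) B is 0
        if b >= a:
            b -= a     # B <- B - A
        else:
            a -= b     # A <- A - B
    return a           # the g.c.d. is left in location A

print(gcd_by_subtraction(1599, 650))   # -> 13
```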
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Etymology" ]
The word 'algorithm' has its roots in ''algorismus'', a Latinization of the nisba (indicating his geographic origin) of the [[Persians|Persian]] mathematician [[Muhammad ibn Musa al-Khwarizmi]]. Al-Khwārizmī ([[Arabization|Arabized]] [[Persian language|Persian]] الخوارزمی c. 780–850) was a mathematician, [[astronomer]], [[geographer]], and scholar in the [[House of Wisdom]] in [[Baghdad]], whose name means 'the native of [[Khwarazm]]', a region that was part of [[Greater Iran]] and is now in [[Uzbekistan]]. About 825, al-Khwarizmi wrote an [[Arabic language]] treatise on the [[Hindu–Arabic numeral system]], which was translated into [[Latin]] during the 12th century. The manuscript starts with the phrase ''Dixit Algorizmi'' ('Thus spake Al-Khwarizmi'), where "Algorizmi" was the translator's [[Latinisation of names|Latinization]] of Al-Khwarizmi's name. Al-Khwarizmi was the most widely read mathematician in Europe in the late Middle Ages, primarily through another of his books, the [[Al-Jabr|Algebra]]. In late medieval Latin, ''algorismus'', English '[[algorism]]', the corruption of his name, simply meant the "decimal number system". In English, ''algorism'' was first used in about 1230 and then by [[Geoffrey Chaucer|Chaucer]] in 1391; English adopted the French form of the term. In the 15th century, under the influence of the Greek word ἀριθμός (''arithmos''), 'number' (''cf.'' 'arithmetic'), the Latin word was altered to ''algorithmus''; the corresponding English term 'algorithm' is first attested in the 17th century, and it was not until the late 19th century that it took on the meaning it has in modern English. Another early use of the word ''algorism'' is from 1240, in a manual titled ''Carmen de Algorismo'' composed by [[Alexander of Villedieu|Alexandre de Villedieu]]; the poem is a few hundred lines long and summarizes the art of calculating with the new styled Indian dice (''Tali Indorum''), or Hindu numerals.
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Informal definition" ]
An informal definition could be "a set of rules that precisely defines a sequence of operations", which would include all computer programs (including programs that do not perform numeric calculations), and (for example) any prescribed [[bureaucratic]] procedure or cook-book [[recipe]]. In general, a program is only an algorithm if it stops eventually—even though [[infinite loop#Intentional looping|infinite loops]] may sometimes prove desirable. A prototypical example of an algorithm is the [[Euclidean algorithm]], which is used to determine the greatest common divisor of two integers; an example (there are others) is described by the [[flowchart]] above and as an example in a later section. Boolos and Jeffrey offer an informal meaning of the word "algorithm" in the following quotation: No human being can write fast enough, or long enough, or small enough† ( †"smaller and smaller without limit … you'd be trying to write on molecules, on atoms, on electrons") to list all members of an enumerably infinite set by writing out their names, one after another, in some notation. But humans can do something equally useful, in the case of certain enumerably infinite sets: They can give ''explicit instructions for determining the '''n'''th member of the set'', for arbitrary finite ''n''. Such instructions are to be given quite explicitly, in a form in which ''they could be followed by a computing machine'', or by a ''human who is capable of carrying out only very elementary operations on symbols.'' An [[recursively enumerable set|"enumerably infinite set"]] is one whose elements can be put into one-to-one correspondence with the integers. Thus Boolos and Jeffrey are saying that an algorithm implies instructions for a process that "creates" output integers from an ''arbitrary'' "input" integer or integers that, in theory, can be arbitrarily large. For example, an algorithm can be an algebraic equation such as ''y = m + n'' (i.e., two arbitrary "input variables" ''m'' and ''n'' that produce an output ''y''), but various authors' attempts to define the notion indicate that the word implies much more than this, something on the order of (for the addition example): Precise instructions (in a language understood by "the computer") for a fast, efficient, "good" process that specifies the "moves" of "the computer" (machine or human, equipped with the necessary internally contained information and capabilities) to find, decode, and then process arbitrary input integers/symbols ''m'' and ''n'', symbols ''+'' and ''='' … and "effectively" produce, in a "reasonable" time, output-integer ''y'' at a specified place and in a specified format. The concept of ''algorithm'' is also used to define the notion of [[decidability (logic)|decidability]]—a notion that is central for explaining how [[formal system]]s come into being starting from a small set of [[axiom]]s and rules. In [[logic]], the time that an algorithm requires to complete cannot be measured, as it is not apparently related to the customary physical dimension. From such uncertainties, which characterize ongoing work, stems the unavailability of a definition of ''algorithm'' that suits both concrete (in some sense) and abstract usage of the term.
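To make the quotation concrete, the following is a minimal sketch in C (the function name nth_square and the choice of the set of perfect squares are illustrative assumptions, not part of Boolos and Jeffrey's text): it gives explicit, elementary instructions for determining the ''n''th member of an enumerably infinite set—here the squares 0, 1, 4, 9, …—for arbitrary finite ''n''.

 #include <stdio.h>
 
 /* Illustrative sketch: explicit instructions for determining the n-th member
    of an enumerably infinite set -- here, the perfect squares {0, 1, 4, 9, ...}.
    A computing machine (or a very patient human) can follow these elementary
    steps for any finite n. */
 unsigned long nth_square(unsigned int n) {
     unsigned long result = 0;
     for (unsigned int i = 0; i < n; i++)
         result += 2UL * i + 1UL;   /* (i+1)^2 - i^2 = 2i + 1 */
     return result;                 /* equals n*n */
 }
 
 int main(void) {
     for (unsigned int n = 0; n < 6; n++)
         printf("member %u = %lu\n", n, nth_square(n));
     return 0;
 }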
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Formalization" ]
Algorithms are essential to the way computers process data. Many computer programs contain algorithms that detail the specific instructions a computer should perform—in a specific order—to carry out a specified task, such as calculating employees' paychecks or printing students' report cards. Thus, an algorithm can be considered to be any sequence of operations that can be simulated by a [[Turing completeness|Turing-complete]] system. Authors who assert this thesis include Minsky (1967), Savage (1987) and Gurevich (2000): Minsky: "But we will also maintain, with Turing … that any procedure which could "naturally" be called effective, can, in fact, be realized by a (simple) machine. Although this may seem extreme, the arguments … in its favor are hard to refute". Gurevich: "… Turing's informal argument in favor of his thesis justifies a stronger thesis: every algorithm can be simulated by a Turing machine … according to Savage [1987], an algorithm is a computational process defined by a Turing machine". Turing machines can define computational processes that do not terminate. The informal definitions of algorithms generally require that the algorithm always terminates. This requirement renders the task of deciding whether a formal procedure is an algorithm impossible in the general case—due to a major theorem of [[computability theory]] known as the [[halting problem]]. Typically, when an algorithm is associated with processing information, data can be read from an input source, written to an output device and stored for further processing. Stored data are regarded as part of the internal state of the entity performing the algorithm. In practice, the state is stored in one or more [[data structure]]s. For some of these computational processes, the algorithm must be rigorously defined: specified in the way it applies in all possible circumstances that could arise. This means that any conditional steps must be systematically dealt with, case by case; the criteria for each case must be clear (and computable). Because an algorithm is a precise list of precise steps, the order of computation is always crucial to the functioning of the algorithm. Instructions are usually assumed to be listed explicitly, and are described as starting "from the top" and going "down to the bottom"—an idea that is described more formally by ''[[control flow|flow of control]]''. So far, the discussion on the formalization of an algorithm has assumed the premises of [[imperative programming]]. This is the most common conception—one which attempts to describe a task in discrete, "mechanical" terms. Unique to this conception of formalized algorithms is the [[assignment operation]], which sets the value of a variable. It derives from the intuition of "[[memory]]" as a scratchpad. An example of such an assignment can be found below. For some alternate conceptions of what constitutes an algorithm, see [[functional programming]] and [[logic programming]].
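As a minimal illustration of the assignment operation and of "memory as a scratchpad" (a sketch; the variable names R, S and L are chosen only to echo the Euclid listings later in this article):

 #include <stdio.h>
 
 int main(void) {
     /* Assignment sets the value of a variable: here the contents of two
        locations R and S are exchanged by way of a temporary location L. */
     int R = 3009, S = 884, L;
     L = R;   /* L <- R */
     R = S;   /* R <- S */
     S = L;   /* S <- L : the values are now exchanged */
     printf("R = %d, S = %d\n", R, S);   /* prints R = 884, S = 3009 */
     return 0;
 }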
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Formalization", "Expressing algorithms" ]
Algorithms can be expressed in many kinds of notation, including [[natural language]], [[pseudocode]], [[flowchart]]s, [[DRAKON|drakon-charts]], [[programming language]]s or [[control table]]s (processed by [[Interpreter (computing)|interpreters]]). Natural language expressions of algorithms tend to be verbose and ambiguous, and are rarely used for complex or technical algorithms. Pseudocode, flowcharts, [[DRAKON|drakon-charts]] and control tables are structured ways to express algorithms that avoid many of the ambiguities common in statements based on natural language. Programming languages are primarily intended for expressing algorithms in a form that can be executed by a computer, but are also often used as a way to define or document algorithms. There is a wide variety of representations possible, and one can express a given [[Turing machine]] program as a sequence of machine tables (see [[finite-state machine]], [[state transition table]] and [[control table]] for more), as flowcharts and [[DRAKON|drakon-charts]] (see [[state diagram]] for more), or as a form of rudimentary [[machine code]] or [[assembly code]] called "sets of quadruples" (see [[Turing machine]] for more). Representations of algorithms can be classed into three accepted levels of Turing machine description, as follows:
(-) 1 High-level description: "…prose to describe an algorithm, ignoring the implementation details. At this level, we do not need to mention how the machine manages its tape or head."
(-) 2 Implementation description: "…prose used to define the way the Turing machine uses its head and the way that it stores data on its tape. At this level, we do not give details of states or transition function."
(-) 3 Formal description: The most detailed, "lowest level", gives the Turing machine's "state table".
For an example of the simple algorithm "Add m+n" described in all three levels, see [[Algorithm#Examples]].
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Design" ]
Algorithm design refers to a method or a mathematical process for problem-solving and for engineering algorithms. The design of algorithms is part of many solution theories of [[operations research]], such as [[dynamic programming]] and [[Divide and conquer algorithm|divide-and-conquer]]. Techniques for designing and implementing algorithm designs are also called algorithm design patterns, with examples including the template method pattern and the decorator pattern. One of the most important aspects of algorithm design is resource (run-time, memory usage) efficiency; the [[big O notation]] is used to describe, for example, an algorithm's run-time growth as the size of its input increases. Typical steps in the development of algorithms: (1) Problem definition (2) Development of a model (3) Specification of the algorithm (4) Designing an algorithm (5) Checking the [[correctness (computer science)|correctness]] of the algorithm (6) Analysis of the algorithm (7) Implementation of the algorithm (8) Program testing (9) Documentation preparation
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Implementation" ]
[[Image:TTL npn nand.svg|right|thumb|[[Logical NAND]] algorithm implemented electronically in a [[7400 series|7400]] chip]] Most algorithms are intended to be implemented as [[computer programs]]. However, algorithms are also implemented by other means, such as in a [[biological neural network]] (for example, the [[human brain]] implementing [[arithmetic]] or an insect looking for food), in an [[electrical circuit]], or in a mechanical device.
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Computer algorithms" ]
In [[computer systems]], an algorithm is basically an instance of [[logic]] written in software by software developers, to be effective for the intended "target" computer(s) to produce ''output'' from given (perhaps null) ''input''. An optimal algorithm, even running on old hardware, would produce faster results than a non-optimal (higher [[time complexity]]) algorithm for the same purpose running on more efficient hardware; that is why algorithms, like computer hardware, are considered technology. ''"Elegant" (compact) programs, "good" (fast) programs'': The notion of "simplicity and elegance" appears informally in [[Donald Knuth|Knuth]] and precisely in [[Gregory Chaitin|Chaitin]]: Knuth: " … we want ''good'' algorithms in some loosely defined aesthetic sense. One criterion … is the length of time taken to perform the algorithm …. Other criteria are adaptability of the algorithm to computers, its simplicity and elegance, etc." Chaitin: " … a program is 'elegant,' by which I mean that it's the smallest possible program for producing the output that it does". Chaitin prefaces his definition with: "I'll show you can't prove that a program is 'elegant'"—such a proof would solve the [[Halting problem]] (ibid). ''Algorithm versus function computable by an algorithm'': For a given function, multiple algorithms may exist. This is true even without expanding the instruction set available to the programmer. Rogers observes that "It is ... important to distinguish between the notion of ''algorithm'', i.e. procedure and the notion of ''function computable by algorithm'', i.e. mapping yielded by procedure. The same function may have several different algorithms". Unfortunately, there may be a tradeoff between goodness (speed) and elegance (compactness)—an elegant program may take more steps to complete a computation than one less elegant. An example that uses Euclid's algorithm appears below. ''Computers (and computors), models of computation'': A computer (or human "computor") is a restricted type of machine, a "discrete deterministic mechanical device" that blindly follows its instructions. Melzak's and Lambek's primitive models reduced this notion to four elements: (i) discrete, distinguishable ''locations'', (ii) discrete, indistinguishable ''counters'', (iii) an agent, and (iv) a list of instructions that are ''effective'' relative to the capability of the agent.
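Rogers' distinction can be made concrete with a small sketch (illustrative C; the function names are assumptions): two different algorithms compute the very same function, the sum 1 + 2 + … + n, one by looping and one in a constant number of operations.

 #include <stdio.h>
 #include <assert.h>
 
 /* Two different algorithms for the same function f(n) = 1 + 2 + ... + n.
    The mapping (the function) is identical; the procedures differ in speed
    and "elegance". */
 
 /* Algorithm 1: n additions. */
 unsigned long sum_by_loop(unsigned long n) {
     unsigned long total = 0;
     for (unsigned long i = 1; i <= n; i++)
         total += i;
     return total;
 }
 
 /* Algorithm 2: closed form, a constant amount of work. */
 unsigned long sum_by_formula(unsigned long n) {
     return n * (n + 1) / 2;
 }
 
 int main(void) {
     for (unsigned long n = 0; n <= 1000; n++)
         assert(sum_by_loop(n) == sum_by_formula(n));   /* same mapping */
     printf("both algorithms agree on 0..1000\n");
     return 0;
 }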
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Computer algorithms" ]
Minsky describes a more congenial variation of Lambek's "abacus" model in his "Very Simple Bases for [[Computability]]". [[Minsky machine|Minsky's machine]] proceeds sequentially through its five (or six, depending on how one counts) instructions unless either a conditional IF-THEN GOTO or an unconditional GOTO changes program flow out of sequence. Besides HALT, Minsky's machine includes three ''assignment'' (replacement, substitution) operations: ZERO (e.g. the contents of location replaced by 0: L ← 0), SUCCESSOR (e.g. L ← L+1), and DECREMENT (e.g. L ← L − 1). Rarely must a programmer write "code" with such a limited instruction set. But Minsky shows (as do Melzak and Lambek) that his machine is [[Turing complete]] with only four general ''types'' of instructions: conditional GOTO, unconditional GOTO, assignment/replacement/substitution, and HALT. However, a few different assignment instructions (e.g. DECREMENT, INCREMENT, and ZERO/CLEAR/EMPTY for a Minsky machine) are also required for Turing-completeness; their exact specification is somewhat up to the designer. The unconditional GOTO is a convenience; it can be constructed by initializing a dedicated location to zero e.g. the instruction " Z ← 0 "; thereafter the instruction IF Z=0 THEN GOTO xxx is unconditional. ''Simulation of an algorithm: computer (computor) language'': Knuth advises the reader that "the best way to learn an algorithm is to try it . . . immediately take pen and paper and work through an example". But what about a simulation or execution of the real thing? The programmer must translate the algorithm into a language that the simulator/computer/computor can ''effectively'' execute. Stone gives an example of this: when computing the roots of a quadratic equation the computor must know how to take a square root. If they don't, then the algorithm, to be effective, must provide a set of rules for extracting a square root. This means that the programmer must know a "language" that is effective relative to the target computing agent (computer/computor). But what model should be used for the simulation? Van Emde Boas observes "even if we base [[Computational complexity theory|complexity theory]] on abstract instead of concrete machines, arbitrariness of the choice of a model remains. It is at this point that the notion of ''simulation'' enters". When speed is being measured, the instruction set matters. For example, the subprogram in Euclid's algorithm to compute the remainder would execute much faster if the programmer had a "[[modular arithmetic|modulus]]" instruction available rather than just subtraction (or worse: just Minsky's "decrement").
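The point about the instruction set can be shown with a short sketch (illustrative C; the function names are assumptions): computing the remainder of 3009 divided by 884 with only a subtraction instruction takes several loop passes, while a built-in modulus instruction does it in a single step.

 #include <stdio.h>
 
 /* Sketch: the remainder of l divided by s computed two ways.  With only a
    "subtract" instruction (as in the Minsky-style machines described above)
    the remainder costs one loop pass per multiple of s; with a built-in
    modulus instruction it costs one step. */
 
 int remainder_by_subtraction(int l, int s) {
     while (l >= s)          /* assumes l >= 0 and s > 0 */
         l = l - s;
     return l;
 }
 
 int remainder_by_modulus(int l, int s) {
     return l % s;           /* a single "modulus" instruction */
 }
 
 int main(void) {
     printf("%d %d\n", remainder_by_subtraction(3009, 884),
                       remainder_by_modulus(3009, 884));   /* both print 357 */
     return 0;
 }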
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Computer algorithms" ]
''Structured programming, canonical structures'': Per the [[Church–Turing thesis]], any algorithm can be computed by a model known to be [[Turing complete]], and per Minsky's demonstrations, Turing completeness requires only four instruction types—conditional GOTO, unconditional GOTO, assignment, HALT. Kemeny and Kurtz observe that, while "undisciplined" use of unconditional GOTOs and conditional IF-THEN GOTOs can result in "[[spaghetti code]]", a programmer can write structured programs using only these instructions; on the other hand, "it is also possible, and not too hard, to write badly structured programs in a structured language". Tausworthe augments the three [[Structured program theorem|Böhm-Jacopini canonical structures]]: SEQUENCE, IF-THEN-ELSE, and WHILE-DO, with two more: DO-WHILE and CASE. An additional benefit of a structured program is that it lends itself to [[proof of correctness|proofs of correctness]] using [[mathematical induction]]. ''Canonical flowchart symbols'': The graphical aid called a [[flowchart]] offers a way to describe and document an algorithm (and a computer program corresponding to it). Like the program flow of a Minsky machine, a flowchart always starts at the top of a page and proceeds down. Its primary symbols are only four: the directed arrow showing program flow, the rectangle (SEQUENCE, GOTO), the diamond (IF-THEN-ELSE), and the dot (OR-tie). The Böhm–Jacopini canonical structures are made of these primitive shapes. Sub-structures can "nest" in rectangles, but only if a single exit occurs from the superstructure. The symbols, and their use to build the canonical structures, are shown in the diagram.
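The contrast Kemeny and Kurtz describe can be sketched as follows (illustrative C; the countdown task is arbitrary): the first version uses only the conditional GOTO, unconditional GOTO and assignment instructions that suffice for Turing completeness, while the second expresses the same computation with the Böhm–Jacopini WHILE-DO structure.

 #include <stdio.h>
 
 /* The same countdown written twice: once with GOTOs only (easy to turn into
    "spaghetti"), once with the canonical WHILE-DO structure, which lends
    itself to induction proofs. */
 
 void countdown_goto(int n) {
 top:
     if (n == 0) goto done;      /* conditional GOTO */
     printf("%d ", n);
     n = n - 1;                  /* assignment */
     goto top;                   /* unconditional GOTO */
 done:
     printf("\n");
 }
 
 void countdown_structured(int n) {
     while (n != 0) {            /* WHILE-DO */
         printf("%d ", n);
         n = n - 1;
     }
     printf("\n");
 }
 
 int main(void) {
     countdown_goto(5);
     countdown_structured(5);    /* both print: 5 4 3 2 1 */
     return 0;
 }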
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Examples", "Algorithm example" ]
One of the simplest algorithms is to find the largest number in a list of numbers of random order. Finding the solution requires looking at every number in the list. From this follows a simple algorithm, which can be stated in a high-level description in English prose, as:

''High-level description:''
(1) If there are no numbers in the set then there is no highest number.
(2) Assume the first number in the set is the largest number in the set.
(3) For each remaining number in the set: if this number is larger than the current largest number, consider this number to be the largest number in the set.
(4) When there are no numbers left in the set to iterate over, consider the current largest number to be the largest number of the set.

''(Quasi-)formal description:'' Written in prose but much closer to the high-level language of a computer program, the following is the more formal coding of the algorithm in [[pseudocode]] or [[pidgin code]]:

 Input: A list of numbers ''L''.
 Output: The largest number in the list ''L''.
 '''if''' ''L.size'' = 0 '''return''' null
 ''largest'' ← ''L''[0]
 '''for each''' ''item'' '''in''' ''L'', '''do'''
     '''if''' ''item'' > ''largest'', '''then'''
         ''largest'' ← ''item''
 '''return''' ''largest''
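For readers who want to run it, here is one possible rendering of the pseudocode in C (a sketch; the function name find_largest and the explicit length parameter are implementation choices, not part of the description above):

 #include <stdio.h>
 
 /* Returns 1 and writes the largest element to *largest on success;
    returns 0 for an empty list ("no highest number"). */
 int find_largest(const int *L, int size, int *largest) {
     if (size == 0)
         return 0;
     *largest = L[0];              /* assume the first item is the largest */
     for (int i = 1; i < size; i++)
         if (L[i] > *largest)      /* a bigger item replaces the current one */
             *largest = L[i];
     return 1;
 }
 
 int main(void) {
     int numbers[] = {7, 3, 12, 9, 4};
     int best;
     if (find_largest(numbers, 5, &best))
         printf("largest = %d\n", best);    /* prints 12 */
     return 0;
 }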
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Examples", "Euclid's algorithm" ]
[[Euclid]]'s algorithm to compute the [[greatest common divisor]] (GCD) of two numbers appears as Proposition II in Book VII ("Elementary Number Theory") of his ''[[Euclid's Elements|Elements]]''. Euclid poses the problem thus: "Given two numbers not prime to one another, to find their greatest common measure". He defines "A number [to be] a multitude composed of units": a counting number, a positive integer not including zero. To "measure" is to place a shorter measuring length ''s'' successively (''q'' times) along longer length ''l'' until the remaining portion ''r'' is less than the shorter length ''s''. In modern words, remainder ''r'' = ''l'' − ''q''×''s'', ''q'' being the quotient, or remainder ''r'' is the "modulus", the integer-fractional part left over after the division. For Euclid's method to succeed, the starting lengths must satisfy two requirements: (i) the lengths must not be zero, AND (ii) the subtraction must be "proper"; i.e., a test must guarantee that the smaller of the two numbers is subtracted from the larger (or the two can be equal so their subtraction yields zero). Euclid's original proof adds a third requirement: the two lengths must not be prime to one another. Euclid stipulated this so that he could construct a [[reductio ad absurdum]] proof that the two numbers' common measure is in fact the ''greatest''. While Nicomachus' algorithm is the same as Euclid's, when the numbers are prime to one another, it yields the number "1" for their common measure. So, to be precise, the following is really Nicomachus' algorithm.
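As a worked illustration of "measuring" (using 3009 and 884, the pair used as a test case later in this article): 3009 = 3×884 + 357, so ''r'' = 357; then 884 = 2×357 + 170; 357 = 2×170 + 17; and 170 = 10×17 + 0. The last nonzero remainder, 17, is the greatest common measure of 3009 and 884.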
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Examples", "Euclid's algorithm", "Computer language for Euclid's algorithm" ]
Only a few instruction ''types'' are required to execute Euclid's algorithm—some logical tests (conditional GOTO), unconditional GOTO, assignment (replacement), and subtraction. (-) A ''location'' is symbolized by upper case letter(s), e.g. S, A, etc. (-) The varying quantity (number) in a location is written in lower case letter(s) and (usually) associated with the location's name. For example, location L at the start might contain the number ''l'' = 3009.
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Examples", "Euclid's algorithm", "An inelegant program for Euclid's algorithm" ]
The following algorithm is framed as Knuth's four-step version of Euclid's and Nicomachus', but, rather than using division to find the remainder, it uses successive subtractions of the shorter length ''s'' from the remaining length ''r'' until ''r'' is less than ''s''. The high-level description, shown in boldface, is adapted from Knuth 1973:2–4:

'''INPUT''':
 [Into two locations L and S put the numbers ''l'' and ''s'' that represent the two lengths]: INPUT L, S
 [Initialize R: make the remaining length ''r'' equal to the starting/initial/input length ''l'']: R ← L

'''E0: [Ensure ''r'' ≥ ''s''.]'''
 [Ensure the smaller of the two numbers is in S and the larger in R]:
 IF R > S THEN the contents of L is the larger number so skip over the exchange-steps [[#4|4]], [[#5|5]] and [[#6|6]]: GOTO step [[#7|7]]
 ELSE swap the contents of R and S.
   L ← R (this first step is redundant, but is useful for later discussion).
   R ← S
   S ← L

'''E1: [Find remainder]''': Until the remaining length ''r'' in R is less than the shorter length ''s'' in S, repeatedly subtract the measuring number ''s'' in S from the remaining length ''r'' in R.
 IF S > R THEN done measuring so GOTO [[#10|10]]
 ELSE measure again,
   R ← R − S
   [Remainder-loop]: GOTO [[#7|7]].

'''E2: [Is the remainder zero?]''': EITHER (i) the last measure was exact, the remainder in R is zero, and the program can halt, OR (ii) the algorithm must continue: the last measure left a remainder in R less than measuring number in S.
 IF R = 0 THEN done so GOTO [[#15|step 15]]
 ELSE CONTINUE TO [[#11|step 11]],

'''E3: [Interchange ''s'' and ''r'']''': The nut of Euclid's algorithm. Use remainder ''r'' to measure what was previously smaller number ''s''; L serves as a temporary location.
   L ← R
   R ← S
   S ← L
 [Repeat the measuring process]: GOTO [[#7|7]]

'''OUTPUT''':
 [Done. S contains the [[greatest common divisor]]]: PRINT S

'''DONE''': HALT, END, STOP.
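A transcription of "Inelegant" into C may help in following the steps (a sketch: the labels E0–E2 and the function name are editorial, the GOTO structure of the original is kept on purpose so the correspondence stays visible, and, like the original, the sketch does not guard against zero inputs):

 #include <stdio.h>
 
 /* "Inelegant": Euclid's/Nicomachus' GCD by repeated subtraction.
    Locations L, S, R mirror the listing above. */
 int inelegant_gcd(int l, int s) {
     int L = l, S = s, R;
     R = L;                      /* INPUT / initialize R */
 E0: if (R > S) goto E1;         /* ensure the larger number is in R */
     L = R;  R = S;  S = L;      /* swap R and S (L is the temporary) */
 E1: if (S > R) goto E2;         /* find remainder by repeated subtraction */
     R = R - S;
     goto E1;
 E2: if (R == 0) goto done;      /* is the remainder zero? */
     L = R;  R = S;  S = L;      /* E3: interchange s and r */
     goto E1;
 done:
     return S;                   /* S contains the greatest common divisor */
 }
 
 int main(void) {
     printf("%d\n", inelegant_gcd(3009, 884));   /* prints 17 */
     return 0;
 }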
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Examples", "Euclid's algorithm", "An elegant program for Euclid's algorithm" ]
The flowchart of "Elegant" can be found at the top of this article. In the (unstructured) Basic language, the steps are numbered, and the instruction LET [] = [] is the assignment instruction symbolized by ←. 5 REM Euclid's algorithm for greatest common divisor 6 PRINT "Type two integers greater than 0" 10 INPUT A,B 20 IF B=0 THEN GOTO 80 30 IF A > B THEN GOTO 60 40 LET B=B-A 50 GOTO 20 60 LET A=A-B 70 GOTO 20 80 PRINT A 90 END ''How "Elegant" works'': In place of an outer "Euclid loop", "Elegant" shifts back and forth between two "co-loops", an A > B loop that computes A ← A − B, and a B ≤ A loop that computes B ← B − A. This works because, when at last the minuend M is less than or equal to the subtrahend S (Difference = Minuend − Subtrahend), the minuend can become ''s'' (the new measuring length) and the subtrahend can become the new ''r'' (the length to be measured); in other words the "sense" of the subtraction reverses. The following version can be used with [[List of C-family programming languages|programming languages from the C-family]]: // Euclid's algorithm for greatest common divisor int euclidAlgorithm (int A, int B){ A=abs(A); B=abs(B); while (B!=0){ if (A>B) A=A-B; else B=B-A; return A;
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Examples", "Testing the Euclid algorithms" ]
Does an algorithm do what its author wants it to do? A few test cases usually give some confidence in the core functionality. But tests are not enough. For test cases, one source uses 3009 and 884. Knuth suggested 40902, 24140. Another interesting case is the two [[relatively prime]] numbers 14157 and 5950. But "exceptional cases" must be identified and tested. Will "Inelegant" perform properly when R > S, S > R, R = S? Ditto for "Elegant": B > A, A > B, A = B? (Yes to all). What happens when one number is zero, both numbers are zero? ("Inelegant" computes forever in all cases; "Elegant" computes forever when A = 0.) What happens if ''negative'' numbers are entered? Fractional numbers? If the input numbers, i.e. the [[domain of a function|domain of the function]] computed by the algorithm/program, is to include only positive integers including zero, then the failures at zero indicate that the algorithm (and the program that [[instance (computer science)|instantiates]] it) is a [[partial function]] rather than a [[total function]]. A notable failure due to exceptions is the [[Ariane 5 Flight 501]] rocket failure (June 4, 1996). ''Proof of program correctness by use of mathematical induction'': Knuth demonstrates the application of [[mathematical induction]] to an "extended" version of Euclid's algorithm, and he proposes "a general method applicable to proving the validity of any algorithm". Tausworthe proposes that a measure of the complexity of a program be the length of its correctness proof.
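A small test harness along these lines might look as follows (a sketch: it repeats the C-family function from the previous subsection so that it is self-contained, and it deliberately omits the zero cases, which, as noted above, never terminate in the subtraction-only versions):

 #include <stdio.h>
 #include <stdlib.h>
 
 int euclidAlgorithm(int A, int B) {
     A = abs(A);
     B = abs(B);
     while (B != 0) {
         if (A > B) A = A - B;
         else       B = B - A;
     }
     return A;
 }
 
 int main(void) {
     /* Expected values are easy to check by hand. */
     struct { int a, b, expected; } cases[] = {
         {3009, 884, 17},       /* the pair used in this article      */
         {40902, 24140, 34},    /* Knuth's suggested pair             */
         {14157, 5950, 1},      /* relatively prime                   */
         {884, 3009, 17},       /* order reversed: B > A              */
         {17, 17, 17},          /* A = B                              */
     };
     int n = sizeof cases / sizeof cases[0];
     for (int i = 0; i < n; i++) {
         int got = euclidAlgorithm(cases[i].a, cases[i].b);
         printf("gcd(%d, %d) = %d (%s)\n", cases[i].a, cases[i].b, got,
                got == cases[i].expected ? "ok" : "FAIL");
     }
     return 0;
 }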
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Examples", "Measuring and improving the Euclid algorithms" ]
''Elegance (compactness) versus goodness (speed)'': With only six core instructions, "Elegant" is the clear winner, compared to "Inelegant" at thirteen instructions. However, "Inelegant" is ''faster'' (it arrives at HALT in fewer steps). [[Algorithm analysis]] indicates why this is the case: "Elegant" does ''two'' conditional tests in every subtraction loop, whereas "Inelegant" only does one. As the algorithm (usually) requires many loop-throughs, ''on average'' much time is wasted doing a "B = 0?" test that is needed only after the remainder is computed. ''Can the algorithms be improved?'': Once the programmer judges a program "fit" and "effective"—that is, it computes the function intended by its author—then the question becomes, can it be improved? The compactness of "Inelegant" can be improved by the elimination of five steps. But Chaitin proved that compacting an algorithm cannot be automated by a generalized algorithm; rather, it can only be done [[heuristic]]ally; i.e., by exhaustive search (examples to be found at [[Busy beaver]]), trial and error, cleverness, insight, application of [[inductive reasoning]], etc. Observe that steps 4, 5 and 6 are repeated in steps 11, 12 and 13. Comparison with "Elegant" provides a hint that these steps, together with steps 2 and 3, can be eliminated. This reduces the number of core instructions from thirteen to eight, which makes it "more elegant" than "Elegant", at nine steps. The speed of "Elegant" can be improved by moving the "B=0?" test outside of the two subtraction loops. This change calls for the addition of three instructions (B = 0?, A = 0?, GOTO). Now "Elegant" computes the example-numbers faster; whether this is always the case for any given A, B, and R, S would require a detailed analysis.
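One possible way to realize such a speed-up in the C-family version is sketched below; it is not necessarily the exact restructuring meant above (which adds B = 0?, A = 0? and a GOTO to the flowchart), but it shows the idea of paying for the zero tests once, up front, and then keeping a single comparison inside each subtraction loop.

 #include <stdio.h>
 
 /* Sketch of the speed improvement: handle zero inputs once, then do one test
    per subtraction inside each co-loop.  Assumes non-negative inputs. */
 int improved_gcd(int A, int B) {
     if (A == 0) return B;
     if (B == 0) return A;
     for (;;) {
         while (A > B) A = A - B;   /* one test per subtraction */
         if (A == B) return A;
         while (B > A) B = B - A;   /* one test per subtraction */
         if (A == B) return A;
     }
 }
 
 int main(void) {
     printf("%d\n", improved_gcd(3009, 884));   /* prints 17 */
     return 0;
 }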
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Algorithmic analysis" ]
It is frequently important to know how much of a particular resource (such as time or storage) is theoretically required for a given algorithm. Methods have been developed for the [[analysis of algorithms]] to obtain such quantitative answers (estimates); for example, the algorithm above for finding the largest number in a list has a time requirement of O(''n''), using the [[big O notation]] with ''n'' as the length of the list. At all times the algorithm only needs to remember two values: the largest number found so far, and its current position in the input list. Therefore, it is said to have a space requirement of ''O(1)'', if the space required to store the input numbers is not counted, or O(''n'') if it is counted. Different algorithms may complete the same task with a different set of instructions in less or more time, space, or '[[algorithmic efficiency|effort]]' than others. For example, a [[binary search]] algorithm (with cost O(log n)) outperforms a sequential search (cost O(n)) when used for [[lookup table|table lookups]] on sorted lists or arrays.
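The cost difference can be seen in a short sketch (illustrative C; the function names are assumptions): the sequential search may inspect every element, while the binary search on a sorted array halves the remaining range at each step.

 #include <stdio.h>
 
 int sequential_search(const int *a, int n, int key) {   /* O(n) */
     for (int i = 0; i < n; i++)
         if (a[i] == key) return i;
     return -1;                        /* not found */
 }
 
 int binary_search(const int *a, int n, int key) {       /* O(log n), a must be sorted */
     int lo = 0, hi = n - 1;
     while (lo <= hi) {
         int mid = lo + (hi - lo) / 2; /* avoids overflow of (lo + hi) / 2 */
         if (a[mid] == key)      return mid;
         else if (a[mid] < key)  lo = mid + 1;
         else                    hi = mid - 1;
     }
     return -1;                        /* not found */
 }
 
 int main(void) {
     int sorted[] = {2, 3, 5, 7, 11, 13, 17, 19};
     printf("%d %d\n", sequential_search(sorted, 8, 13),
                       binary_search(sorted, 8, 13));    /* both print 5 */
     return 0;
 }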
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Algorithmic analysis", "Formal versus empirical" ]
The [[analysis of algorithms|analysis, and study of algorithms]] is a discipline of [[computer science]], and is often practiced abstractly without the use of a specific [[programming language]] or implementation. In this sense, algorithm analysis resembles other mathematical disciplines in that it focuses on the underlying properties of the algorithm and not on the specifics of any particular implementation. Usually [[pseudocode]] is used for analysis as it is the simplest and most general representation. However, ultimately, most algorithms are usually implemented on particular hardware/software platforms and their [[algorithmic efficiency]] is eventually put to the test using real code. For the solution of a "one off" problem, the efficiency of a particular algorithm may not have significant consequences (unless n is extremely large) but for algorithms designed for fast interactive, commercial or long life scientific usage it may be critical. Scaling from small n to large n frequently exposes inefficient algorithms that are otherwise benign. Empirical testing is useful because it may uncover unexpected interactions that affect performance. [[Benchmark (computing)|Benchmarks]] may be used to compare before/after potential improvements to an algorithm after program optimization. Empirical tests cannot replace formal analysis, though, and are not trivial to perform in a fair manner.
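The mechanics of such a before/after benchmark can be sketched as follows (illustrative only: the two summation routines stand in for the "before" and "after" versions of a program, and a serious benchmark would add warm-up runs, repetitions and statistics):

 #include <stdio.h>
 #include <time.h>
 
 /* Minimal benchmarking sketch: time the same task computed by a slow and by
    a fast algorithm, as one might before and after an optimization. */
 unsigned long long sum_by_loop(unsigned long long n) {
     unsigned long long t = 0;
     for (unsigned long long i = 1; i <= n; i++) t += i;
     return t;
 }
 
 unsigned long long sum_by_formula(unsigned long long n) { return n * (n + 1) / 2; }
 
 int main(void) {
     volatile unsigned long long sink = 0;   /* keeps the work from being optimized away */
     clock_t t0 = clock();
     for (int r = 0; r < 1000; r++) sink += sum_by_loop(100000ULL);
     clock_t t1 = clock();
     for (int r = 0; r < 1000; r++) sink += sum_by_formula(100000ULL);
     clock_t t2 = clock();
     printf("loop:    %.4f s\n", (double)(t1 - t0) / CLOCKS_PER_SEC);
     printf("formula: %.4f s\n", (double)(t2 - t1) / CLOCKS_PER_SEC);
     return 0;
 }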
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Algorithmic analysis", "Execution efficiency" ]
To illustrate the potential improvements possible even in well-established algorithms, a recent significant innovation, relating to [[Fast Fourier transform|FFT]] algorithms (used heavily in the field of image processing), can decrease processing time up to 1,000 times for applications like medical imaging. In general, speed improvements depend on special properties of the problem, which are very common in practical applications. Speedups of this magnitude enable computing devices that make extensive use of image processing (like digital cameras and medical equipment) to consume less power.
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Classification" ]
There are various ways to classify algorithms, each with its own merits.
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Classification", "By implementation" ]
One way to classify algorithms is by implementation means.
(-) Recursion A [[recursive algorithm]] is one that invokes (makes reference to) itself repeatedly until a certain condition (also known as the termination condition) matches, which is a method common to [[functional programming]]. [[Iteration|Iterative]] algorithms use repetitive constructs like [[Program loops|loops]] and sometimes additional data structures like [[Stack (data structure)|stacks]] to solve the given problems. Some problems are naturally suited for one implementation or the other. For example, [[towers of Hanoi]] is well understood using recursive implementation. Every recursive version has an equivalent (but possibly more or less complex) iterative version, and vice versa (a sketch of both styles follows this list).
(-) Logical An algorithm may be viewed as controlled [[Deductive reasoning|logical deduction]]. This notion may be expressed as: ''Algorithm = logic + control''. The logic component expresses the axioms that may be used in the computation and the control component determines the way in which deduction is applied to the axioms. This is the basis for the [[logic programming]] paradigm. In pure logic programming languages, the control component is fixed and algorithms are specified by supplying only the logic component. The appeal of this approach is the elegant [[Formal semantics of programming languages|semantics]]: a change in the axioms produces a well-defined change in the algorithm.
(-) Serial, parallel or distributed Algorithms are usually discussed with the assumption that computers execute one instruction of an algorithm at a time. Those computers are sometimes called serial computers. An [[algorithm design]] for such an environment is called a serial algorithm, as opposed to [[parallel algorithm]]s or [[distributed algorithms]]. Parallel algorithms take advantage of computer architectures where several processors can work on a problem at the same time, whereas distributed algorithms utilize multiple machines connected with a [[computer network]]. Parallel or distributed algorithms divide the problem into more symmetrical or asymmetrical subproblems and collect the results back together. The resource consumption in such algorithms is not only processor cycles on each processor but also the communication overhead between the processors. Some sorting algorithms can be parallelized efficiently, but their communication overhead is expensive. Iterative algorithms are generally parallelizable. Some problems have no parallel algorithms and are called inherently serial problems.
(-) Deterministic or non-deterministic [[Deterministic algorithm]]s solve the problem with an exact decision at every step of the algorithm, whereas [[non-deterministic algorithm]]s solve problems via guessing, although typical guesses are made more accurate through the use of [[heuristics]].
(-) Exact or approximate While many algorithms reach an exact solution, [[approximation algorithm]]s seek an approximation that is closer to the true solution. The approximation can be reached by either using a deterministic or a random strategy. Such algorithms have practical value for many hard problems. An example of a problem solved approximately is the [[Knapsack problem]], where there is a set of given items and the goal is to pack the knapsack so as to get the maximum total value. Each item has some weight and some value, and the total weight that can be carried is no more than some fixed number X. So, the solution must consider the weights of items as well as their value.
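As promised above, a minimal sketch of the recursion/iteration equivalence (illustrative C; factorial is chosen only because it is short): the same function n! computed once by an algorithm that invokes itself and once by a loop.

 #include <stdio.h>
 
 unsigned long long factorial_recursive(unsigned int n) {
     if (n == 0) return 1;                       /* termination condition */
     return n * factorial_recursive(n - 1);      /* the algorithm invokes itself */
 }
 
 unsigned long long factorial_iterative(unsigned int n) {
     unsigned long long result = 1;
     for (unsigned int i = 2; i <= n; i++)       /* repetitive construct (loop) */
         result *= i;
     return result;
 }
 
 int main(void) {
     for (unsigned int n = 0; n <= 10; n++)
         printf("%u! = %llu = %llu\n", n,
                factorial_recursive(n), factorial_iterative(n));
     return 0;
 }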
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Classification", "By implementation" ]
(-) [[Quantum algorithm]] They run on a realistic model of [[quantum computation]]. The term is usually used for those algorithms which seem inherently quantum, or use some essential feature of [[Quantum computing]] such as [[quantum superposition]] or [[quantum entanglement]].
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Classification", "By design paradigm" ]
Another way of classifying algorithms is by their design methodology or [[algorithmic paradigm|paradigm]]. There are a number of such paradigms, each different from the others, and each of these categories includes many different types of algorithms. Some common paradigms are:
(-) [[Brute force search|Brute-force]] or exhaustive search This is the [[Naïve algorithm|naive method]] of trying every possible solution to see which is best.
(-) Divide and conquer A [[divide and conquer algorithm]] repeatedly reduces an instance of a problem to one or more smaller instances of the same problem (usually [[recursion|recursively]]) until the instances are small enough to solve easily. One such example of divide and conquer is [[mergesort|merge sorting]]: sorting can be done on each segment of data after dividing the data into segments, and sorting of the entire data can be obtained in the conquer phase by merging the segments (a compact sketch of merge sort follows this list). A simpler variant of divide and conquer is called a ''decrease and conquer algorithm'', which solves an identical subproblem and uses the solution of this subproblem to solve the bigger problem. Divide and conquer divides the problem into multiple subproblems, so the conquer stage is more complex than in decrease and conquer algorithms. An example of a decrease and conquer algorithm is the [[binary search algorithm]].
(-) Search and enumeration Many problems (such as playing [[chess]]) can be modeled as problems on [[graph theory|graphs]]. A [[graph exploration algorithm]] specifies rules for moving around a graph and is useful for such problems. This category also includes [[search algorithm]]s, [[branch and bound]] enumeration and [[backtracking]].
(-) [[Randomized algorithm]]s Such algorithms make some choices randomly (or pseudo-randomly). They can be very useful in finding approximate solutions for problems where finding exact solutions can be impractical (see heuristic method below). For some of these problems, it is known that the fastest approximations must involve some [[randomness]]. Whether randomized algorithms with [[P (complexity)|polynomial time complexity]] can be the fastest algorithms for some problems is an open question known as the [[P versus NP problem]]. There are two large classes of such algorithms:
(1) [[Monte Carlo algorithm]]s return a correct answer with high probability. E.g. [[RP (complexity)|RP]] is the subclass of these that run in [[polynomial time]].
(2) [[Las Vegas algorithm]]s always return the correct answer, but their running time is only probabilistically bound, e.g. [[Zero-error Probabilistic Polynomial time|ZPP]].
(-) [[Reduction (complexity)|Reduction of complexity]] This technique involves solving a difficult problem by transforming it into a better-known problem for which we have (hopefully) [[asymptotically optimal]] algorithms. The goal is to find a reducing algorithm whose [[Computational complexity theory|complexity]] is not dominated by the resulting reduced algorithm's. For example, one [[selection algorithm]] for finding the median in an unsorted list involves first sorting the list (the expensive portion) and then pulling out the middle element in the sorted list (the cheap portion). This technique is also known as ''[[Transform and conquer algorithm|transform and conquer]]''.
(-) [[Backtracking]] In this approach, multiple solutions are built incrementally and abandoned when it is determined that they cannot lead to a valid full solution.
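A compact sketch of the merge-sort example mentioned under divide and conquer (illustrative C; the helper array and function names are implementation choices): the array is split, each half is sorted recursively (divide), and the sorted halves are merged (conquer).

 #include <stdio.h>
 #include <string.h>
 
 /* Sorts a[lo..hi) using the scratch array tmp (same length as a). */
 static void merge_sort(int *a, int *tmp, int lo, int hi) {
     if (hi - lo < 2) return;                 /* 0 or 1 element: already sorted */
     int mid = lo + (hi - lo) / 2;
     merge_sort(a, tmp, lo, mid);             /* divide: sort left half  */
     merge_sort(a, tmp, mid, hi);             /* divide: sort right half */
     int i = lo, j = mid, k = lo;
     while (i < mid && j < hi)                /* conquer: merge the halves */
         tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
     while (i < mid) tmp[k++] = a[i++];
     while (j < hi)  tmp[k++] = a[j++];
     memcpy(a + lo, tmp + lo, (size_t)(hi - lo) * sizeof *a);
 }
 
 int main(void) {
     int a[] = {5, 2, 9, 1, 5, 6};
     int tmp[6];
     merge_sort(a, tmp, 0, 6);
     for (int i = 0; i < 6; i++) printf("%d ", a[i]);   /* 1 2 5 5 6 9 */
     printf("\n");
     return 0;
 }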
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Classification", "Optimization problems" ]
For [[optimization problem]]s there is a more specific classification of algorithms; an algorithm for such problems may fall into one or more of the general categories described above as well as into one of the following:
(-) [[Linear programming]] When searching for optimal solutions to a linear function bound to linear equality and inequality constraints, the constraints of the problem can be used directly in producing the optimal solutions. There are algorithms that can solve any problem in this category, such as the popular [[simplex algorithm]]. Problems that can be solved with linear programming include the [[maximum flow problem]] for directed graphs. If a problem additionally requires that one or more of the unknowns must be an [[integer]] then it is classified in [[integer programming]]. A linear programming algorithm can solve such a problem if it can be proved that all restrictions for integer values are superficial, i.e., the solutions satisfy these restrictions anyway. In the general case, a specialized algorithm or an algorithm that finds approximate solutions is used, depending on the difficulty of the problem.
(-) [[Dynamic programming]] When a problem shows [[optimal substructure]]—meaning the optimal solution to a problem can be constructed from optimal solutions to subproblems—and [[overlapping subproblems]], meaning the same subproblems are used to solve many different problem instances, a quicker approach called ''dynamic programming'' avoids recomputing solutions that have already been computed. For example, in the [[Floyd–Warshall algorithm]], the shortest path to a goal from a vertex in a weighted [[graph (discrete mathematics)|graph]] can be found by using the shortest path to the goal from all adjacent vertices. Dynamic programming and [[memoization]] go together (a small memoization sketch follows this list). The main difference between dynamic programming and divide and conquer is that subproblems are more or less independent in divide and conquer, whereas subproblems overlap in dynamic programming. The difference between dynamic programming and straightforward recursion is in the caching or memoization of recursive calls. When subproblems are independent and there is no repetition, memoization does not help; hence dynamic programming is not a solution for all complex problems. By using memoization or maintaining a [[Mathematical table|table]] of subproblems already solved, dynamic programming reduces the exponential nature of many problems to polynomial complexity.
(-) The greedy method A [[greedy algorithm]] is similar to a dynamic programming algorithm in that it works by examining substructures, in this case not of the problem but of a given solution. Such algorithms start with some solution, which may be given or have been constructed in some way, and improve it by making small modifications. For some problems they can find the optimal solution while for others they stop at [[local optimum|local optima]], that is, at solutions that cannot be improved by the algorithm but are not optimal. The most popular use of greedy algorithms is finding the minimal spanning tree, where finding the optimal solution is possible with this method. [[Huffman coding|Huffman Tree]], [[kruskal's algorithm|Kruskal]], [[Prim's algorithm|Prim]], [[Sollin's algorithm|Sollin]] are greedy algorithms that can solve this optimization problem.
(-) The heuristic method
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Classification", "Optimization problems" ]
In [[optimization problem]]s, [[heuristic algorithm]]s can be used to find a solution close to the optimal solution in cases where finding the optimal solution is impractical. These algorithms work by getting closer and closer to the optimal solution as they progress. In principle, if run for an infinite amount of time, they will find the optimal solution. Their merit is that they can find a solution very close to the optimal solution in a relatively short time. Such algorithms include [[local search (optimization)|local search]], [[tabu search]], [[simulated annealing]], and [[genetic algorithm]]s. Some of them, like simulated annealing, are non-deterministic algorithms while others, like tabu search, are deterministic. When a bound on the error of the non-optimal solution is known, the algorithm is further categorized as an [[approximation algorithm]].
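To illustrate the memoization mentioned in the dynamic-programming entry above (a sketch; Fibonacci numbers are used only because their overlapping subproblems are easy to see): without the table, the recursive computation repeats the same subproblems exponentially often; with it, each subproblem is solved once.

 #include <stdio.h>
 
 /* Memoization sketch: cache each solved subproblem so that overlapping
    subproblems are computed only once (exponential recursion becomes linear). */
 #define N 64
 static long long memo[N];            /* 0 means "not computed yet" */
 
 long long fib(int n) {               /* valid for 0 <= n < N */
     if (n <= 1) return n;            /* base cases */
     if (memo[n] != 0) return memo[n];
     memo[n] = fib(n - 1) + fib(n - 2);
     return memo[n];
 }
 
 int main(void) {
     printf("fib(50) = %lld\n", fib(50));   /* answers immediately thanks to the table */
     return 0;
 }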
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Classification", "By field of study" ]
Every field of science has its own problems and needs efficient algorithms. Related problems in one field are often studied together. Some example classes are [[search algorithm]]s, [[sorting algorithm]]s, [[merge algorithm]]s, [[numerical analysis|numerical algorithms]], [[graph theory|graph algorithms]], [[string algorithms]], [[computational geometry|computational geometric algorithms]], [[combinatorial|combinatorial algorithms]], [[medical algorithm]]s, [[machine learning]], [[cryptography]], [[data compression]] algorithms and [[parsing|parsing techniques]]. Fields tend to overlap with each other, and algorithmic advances in one field may improve those of other, sometimes completely unrelated, fields. For example, dynamic programming was invented for optimization of resource consumption in industry but is now used in solving a broad range of problems in many fields.
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Classification", "By complexity" ]
Algorithms can be classified by the amount of time they need to complete compared to their input size: (-) Constant time: if the time needed by the algorithm is the same regardless of the input size, e.g. accessing a single element of an [[Array data structure|array]]. (-) Logarithmic time: if the time is a logarithmic function of the input size, e.g. the [[binary search algorithm]]. (-) Linear time: if the time is proportional to the input size, e.g. traversing a list. (-) Polynomial time: if the time is a polynomial function of the input size, e.g. the [[bubble sort]] algorithm, which has quadratic time complexity. (-) Exponential time: if the time is an exponential function of the input size, e.g. [[Brute-force search]]. Some problems may have multiple algorithms of differing complexity, while other problems might have no algorithms or no known efficient algorithms. There are also mappings from some problems to other problems. Owing to this, it is often more suitable to classify the problems themselves, rather than the algorithms, into equivalence classes based on the complexity of the best possible algorithms for them.
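To make the contrast between two of the classes above concrete, the following Python sketch compares a linear traversal of a list with binary search on the same sorted data; the data set is arbitrary example input.
<syntaxhighlight lang="python">
# Linear time versus logarithmic time on the same sorted data.
from bisect import bisect_left

def linear_search(items, target):
    # O(n): in the worst case every element is inspected.
    for index, value in enumerate(items):
        if value == target:
            return index
    return -1

def binary_search(sorted_items, target):
    # O(log n): each comparison halves the remaining search range.
    index = bisect_left(sorted_items, target)
    if index < len(sorted_items) and sorted_items[index] == target:
        return index
    return -1

data = list(range(0, 1_000_000, 3))    # already sorted
print(linear_search(data, 999_999))    # scans ~333,334 elements; prints 333333
print(binary_search(data, 999_999))    # ~20 comparisons; prints 333333
</syntaxhighlight>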
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Continuous algorithms" ]
The adjective "continuous" when applied to the word "algorithm" can mean: (-) An algorithm operating on data that represents continuous quantities, even though this data is represented by discrete approximations—such algorithms are studied in [[numerical analysis]]; or (-) An algorithm in the form of a [[differential equation]] that operates continuously on the data, running on an [[analog computer]].
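As a small example of the first sense of "continuous" above, the following Python sketch uses the forward Euler method from [[numerical analysis]] to approximate the continuous solution of the differential equation dy/dt = -y; the step count and interval are arbitrary choices for the example.
<syntaxhighlight lang="python">
# Forward Euler method: a discrete algorithm approximating a continuous
# quantity. The exact solution of dy/dt = -y with y(0) = 1 is y = exp(-t).
import math

def euler(f, y0, t0, t1, steps):
    h = (t1 - t0) / steps
    t, y = t0, y0
    for _ in range(steps):
        y += h * f(t, y)   # follow the local tangent for one small step
        t += h
    return y

approx = euler(lambda t, y: -y, y0=1.0, t0=0.0, t1=1.0, steps=1000)
print(approx, math.exp(-1.0))  # ~0.36770 versus 0.367879...
</syntaxhighlight>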
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "Legal issues" ]
Algorithms, by themselves, are not usually patentable. In the United States, a claim consisting solely of simple manipulations of abstract concepts, numbers, or signals does not constitute a "process" (USPTO 2006), and hence algorithms are not patentable (as in [[Gottschalk v. Benson]]). However, practical applications of algorithms are sometimes patentable. For example, in [[Diamond v. Diehr]], the application of a simple [[feedback]] algorithm to aid in the curing of [[synthetic rubber]] was deemed patentable. The [[Software patent debate|patenting of software]] is highly controversial, and there are highly criticized patents involving algorithms, especially [[data compression]] algorithms, such as [[Unisys]]' [[Graphics Interchange Format#Unisys and LZW patent enforcement|LZW patent]]. Additionally, some cryptographic algorithms have export restrictions (see [[export of cryptography]]).
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "History: Development of the notion of \"algorithm\"", "Ancient Near East" ]
The earliest evidence of algorithms is found in the [[Babylonian mathematics]] of ancient [[Mesopotamia]] (modern Iraq). A [[Sumer]]ian clay tablet found in [[Shuruppak]] near [[Baghdad]] and dated to circa 2500 BC describes the earliest [[division algorithm]]. During the [[First Babylonian dynasty|Hammurabi dynasty]], circa 1800–1600 BC, [[Babylonia]]n clay tablets described algorithms for computing [[formulas]]. Algorithms were also used in [[Babylonian astronomy]]: Babylonian clay tablets describe and employ algorithmic procedures to compute the time and place of significant astronomical events. Algorithms for arithmetic are also found in ancient [[Egyptian mathematics]], dating back to the [[Rhind Mathematical Papyrus]] circa 1550 BC. Algorithms were later used in ancient [[Hellenistic mathematics]]. Two examples are the [[Sieve of Eratosthenes]], which was described in the ''[[Introduction to Arithmetic]]'' by [[Nicomachus]], and the [[Euclidean algorithm]], which was first described in ''[[Euclid's Elements]]'' (c. 300 BC).
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "History: Development of the notion of \"algorithm\"", "Discrete and distinguishable symbols" ]
Tally-marks: To keep track of their flocks, their sacks of grain and their money, the ancients used tallying: accumulating stones or marks scratched on sticks or making discrete symbols in clay. Through the Babylonian and Egyptian use of marks and symbols, eventually [[Roman numerals]] and the [[abacus]] evolved (Dilson, p. 16–41). Tally marks appear prominently in the [[unary numeral system]] arithmetic used in [[Turing machine]] and [[Post–Turing machine]] computations.
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "History: Development of the notion of \"algorithm\"", "Manipulation of symbols as \"place holders\" for numbers: algebra" ]
[[Muhammad ibn Mūsā al-Khwārizmī]], a [[Mathematics in medieval Islam|Persian mathematician]], wrote the ''[[Al-jabr]]'' in the 9th century. The terms "[[algorism]]" and "algorithm" are derived from the name al-Khwārizmī, while the term "[[algebra]]" is derived from the book ''Al-jabr''. In Europe, the word "algorithm" was originally used to refer to the sets of rules and techniques used by al-Khwārizmī to solve algebraic equations, before later being generalized to refer to any set of rules or techniques. This eventually culminated in [[Gottfried Leibniz|Leibniz]]'s notion of the [[calculus ratiocinator]] (ca. 1680).
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "History: Development of the notion of \"algorithm\"", "Cryptographic algorithms" ]
The first [[cryptographic]] algorithm for deciphering encrypted code was developed by [[Al-Kindi]], a 9th-century [[Mathematics in medieval Islam|Arab mathematician]], in ''A Manuscript On Deciphering Cryptographic Messages''. He gave the first description of [[cryptanalysis]] by [[frequency analysis]], the earliest [[codebreaking]] algorithm.
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "History: Development of the notion of \"algorithm\"", "Mechanical contrivances with discrete states" ]
''The clock'': Bolter credits the invention of the weight-driven [[clock]] as "The key invention [of Europe in the Middle Ages]", in particular the [[verge escapement]] that provides us with the tick and tock of a mechanical clock. "The accurate automatic machine" led immediately to "mechanical [[automata theory|automata]]" beginning in the 13th century and finally to "computational machines"—the [[difference engine]] and [[analytical engine]] of [[Charles Babbage]] and Countess [[Ada Lovelace]], mid-19th century. Lovelace is credited with the first creation of an algorithm intended for processing on a computer—Babbage's analytical engine, the first device considered a real [[Turing-complete]] computer instead of just a [[calculator]]—and is sometimes called "history's first programmer" as a result, though a full implementation of Babbage's second device would not be realized until decades after her lifetime. ''Logical machines 1870 – [[Stanley Jevons]]' "logical abacus" and "logical machine"'': The technical problem was to reduce [[Boolean equation]]s when presented in a form similar to what are now known as [[Karnaugh map]]s. Jevons (1880) describes first a simple "abacus" of "slips of wood furnished with pins, contrived so that any part or class of the [logical] combinations can be picked out mechanically ... More recently, however, I have reduced the system to a completely mechanical form, and have thus embodied the whole of the indirect process of inference in what may be called a ''Logical Machine''". His machine came equipped with "certain moveable wooden rods" and "at the foot are 21 keys like those of a piano [etc] ...". With this machine he could analyze a "[[syllogism]] or any other simple logical argument". He displayed this machine in 1870 before the Fellows of the Royal Society. Another logician, [[John Venn]], however, in his 1881 ''Symbolic Logic'', turned a jaundiced eye to this effort: "I have no high estimate myself of the interest or importance of what are sometimes called logical machines ... it does not seem to me that any contrivances at present known or likely to be discovered really deserve the name of logical machines"; see more at [[Algorithm characterizations]]. But, not to be outdone, he too presented "a plan somewhat analogous, I apprehend, to Prof. Jevon's ''abacus'' ... [And] [a]gain, corresponding to Prof. Jevons's logical machine, the following contrivance may be described. I prefer to call it merely a logical-diagram machine ... but I suppose that it could do very completely all that can be rationally expected of any logical machine".
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "History: Development of the notion of \"algorithm\"", "Mechanical contrivances with discrete states" ]
''Jacquard loom, Hollerith punch cards, telegraphy and telephony – the electromechanical relay'': Bell and Newell (1971) indicate that the [[Jacquard loom]] (1801), precursor to [[Hollerith cards]] (punch cards, 1887), and "telephone switching technologies" were the roots of a tree leading to the development of the first computers. By the mid-19th century the [[telegraph]], the precursor of the telephone, was in use throughout the world, its discrete and distinguishable encoding of letters as "dots and dashes" a common sound. By the late 19th century the [[ticker tape]] (ca. 1870s) was in use, as was the use of Hollerith cards in the 1890 U.S. census. Then came the [[teleprinter]] (ca. 1910) with its punched-paper use of [[Baudot code]] on tape. ''Telephone-switching networks'' of electromechanical [[relay]]s (invented 1835) were behind the work of [[George Stibitz]] (1937), the inventor of the digital adding device. As he worked in Bell Laboratories, he observed the "burdensome" use of mechanical calculators with gears. "He went home one evening in 1937 intending to test his idea... When the tinkering was over, Stibitz had constructed a binary adding device". Davis (2000) observes the particular importance of the electromechanical relay (with its two "binary states" ''open'' and ''closed''): "It was only with the development, beginning in the 1930s, of electromechanical calculators using electrical relays, that machines were built having the scope Babbage had envisioned."
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "History: Development of the notion of \"algorithm\"", "Mathematics during the 19th century up to the mid-20th century" ]
''Symbols and rules'': In rapid succession, the mathematics of [[George Boole]] (1847, 1854), [[Gottlob Frege]] (1879), and [[Giuseppe Peano]] (1888–1889) reduced arithmetic to a sequence of symbols manipulated by rules. Peano's ''The principles of arithmetic, presented by a new method'' (1888) was "the first attempt at an axiomatization of mathematics in a [[Symbolic language (programming)|symbolic language]]". But Heijenoort gives Frege (1879) this kudos: Frege's is "perhaps the most important single work ever written in logic ... in which we see a "formula language", that is a ''lingua characterica'', a language written with special symbols, "for pure thought", that is, free from rhetorical embellishments ... constructed from specific symbols that are manipulated according to definite rules". The work of Frege was further simplified and amplified by [[Alfred North Whitehead]] and [[Bertrand Russell]] in their [[Principia Mathematica]] (1910–1913). ''The paradoxes'': At the same time a number of disturbing paradoxes appeared in the literature, in particular the [[Burali-Forti paradox]] (1897), the [[Russell paradox]] (1902–03), and the [[Richard Paradox]]. The resultant considerations led to [[Kurt Gödel]]'s paper (1931)—he specifically cites the paradox of the liar—that completely reduces rules of [[recursion]] to numbers. ''Effective calculability'': In an effort to solve the [[Entscheidungsproblem]], defined precisely by Hilbert in 1928, mathematicians first set about to define what was meant by an "effective method" or "effective calculation" or "effective calculability" (i.e., a calculation that would succeed). In rapid succession the following appeared: [[Alonzo Church]], [[Stephen Kleene]] and [[J.B. Rosser]]'s [[λ-calculus]]; a finely honed definition of "general recursion" from the work of Gödel, acting on suggestions of [[Jacques Herbrand]] (cf. Gödel's Princeton lectures of 1934), and subsequent simplifications by Kleene; Church's proof that the Entscheidungsproblem was unsolvable; [[Emil Post]]'s definition of effective calculability as a worker mindlessly following a list of instructions to move left or right through a sequence of rooms and, while there, either mark or erase a paper or observe the paper and make a yes-no decision about the next instruction; Alan Turing's proof that the Entscheidungsproblem was unsolvable by use of his "a- [automatic-] machine"—in effect almost identical to Post's "formulation"; [[J. Barkley Rosser]]'s definition of "effective method" in terms of "a machine"; Kleene's proposal of a precursor to the "[[Church thesis]]" that he called "Thesis I"; and, a few years later, Kleene's renaming of his Thesis "Church's Thesis" and proposal of "Turing's Thesis".
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "History: Development of the notion of \"algorithm\"", "Emil Post (1936) and Alan Turing (1936–37, 1939)" ]
[[Emil Post]] (1936) described the actions of a "computer" (human being) as follows: "...two concepts are involved: that of a ''symbol space'' in which the work leading from problem to answer is to be carried out, and a fixed unalterable ''set of directions''." His symbol space would be "a two-way infinite sequence of spaces or boxes... The problem solver or worker is to move and work in this symbol space, being capable of being in, and operating in but one box at a time... a box is to admit of but two possible conditions, i.e., being empty or unmarked, and having a single mark in it, say a vertical stroke. "One box is to be singled out and called the starting point. ...a specific problem is to be given in symbolic form by a finite number of boxes [i.e., INPUT] being marked with a stroke. Likewise, the answer [i.e., OUTPUT] is to be given in symbolic form by such a configuration of marked boxes... "A set of directions applicable to a general problem sets up a deterministic process when applied to each specific problem. This process terminates only when it comes to the direction of type (C) [i.e., STOP]". See more at [[Post–Turing machine]]. [[Alan Turing]]'s work preceded that of Stibitz (1937); it is unknown whether Stibitz knew of the work of Turing. Turing's biographer believed that Turing's use of a typewriter-like model derived from a youthful interest: "Alan had dreamt of inventing typewriters as a boy; Mrs. Turing had a typewriter, and he could well have begun by asking himself what was meant by calling a typewriter 'mechanical'". Given the prevalence of Morse code and telegraphy, ticker tape machines, and teletypewriters, we might conjecture that all were influences. Turing—his model of computation is now called a [[Turing machine]]—begins, as did Post, with an analysis of a human computer that he whittles down to a simple set of basic motions and "states of mind". But he continues a step further and creates a machine as a model of computation of numbers. "Computing is normally done by writing certain symbols on paper. We may suppose this paper is divided into squares like a child's arithmetic book...I assume then that the computation is carried out on one-dimensional paper, i.e., on a tape divided into squares. I shall also suppose that the number of symbols which may be printed is finite... "The behavior of the computer at any moment is determined by the symbols which he is observing, and his "state of mind" at that moment. We may suppose that there is a bound B to the number of symbols or squares which the computer can observe at one moment. If he wishes to observe more, he must use successive observations. We will also suppose that the number of states of mind which need be taken into account is finite... "Let us imagine that the operations performed by the computer to be split up into 'simple operations' which are so elementary that it is not easy to imagine them further divided."
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "History: Development of the notion of \"algorithm\"", "Emil Post (1936) and Alan Turing (1936–37, 1939)" ]
Turing's reduction yields the following: "The simple operations must therefore include: "(a) Changes of the symbol on one of the observed squares "(b) Changes of one of the squares observed to another square within L squares of one of the previously observed squares. "It may be that some of these changes necessarily involve a change of state of mind. The most general single operation must, therefore, be taken to be one of the following: "(A) A possible change (a) of symbol together with a possible change of state of mind. "(B) A possible change (b) of observed squares, together with a possible change of state of mind." "We may now construct a machine to do the work of this computer." A few years later, Turing expanded his analysis (thesis, definition) with this forceful expression of it: "A function is said to be "effectively calculable" if its values can be found by some purely mechanical process. Though it is fairly easy to get an intuitive grasp of this idea, it is nevertheless desirable to have some more definite, mathematically expressible definition ... [he discusses the history of the definition pretty much as presented above with respect to Gödel, Herbrand, Kleene, Church, Turing, and Post] ... We may take this statement literally, understanding by a purely mechanical process one which could be carried out by a machine. It is possible to give a mathematical description, in a certain normal form, of the structures of these machines. The development of these ideas leads to the author's definition of a computable function, and to an identification of computability † with effective calculability ... . "† We shall use the expression "computable function" to mean a function calculable by a machine, and we let "effectively calculable" refer to the intuitive idea without particular identification with any one of these definitions".
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "History: Development of the notion of \"algorithm\"", "J.B. Rosser (1939) and S.C. Kleene (1943)" ]
[[J. Barkley Rosser]] defined an 'effective [mathematical] method' in the following manner (italicization added): "'Effective method' is used here in the rather special sense of a method each step of which is precisely determined and which is certain to produce the answer in a finite number of steps. With this special meaning, three different precise definitions have been given to date. [his footnote #5; see discussion immediately below]. The simplest of these to state (due to Post and Turing) says essentially that ''an effective method of solving certain sets of problems exists if one can build a machine which will then solve any problem of the set with no human intervention beyond inserting the question and (later) reading the answer''. All three definitions are equivalent, so it doesn't matter which one is used. Moreover, the fact that all three are equivalent is a very strong argument for the correctness of any one." (Rosser 1939:225–226) Rosser's footnote No. 5 references the work of (1) Church and Kleene and their definition of λ-definability, in particular Church's use of it in his ''An Unsolvable Problem of Elementary Number Theory'' (1936); (2) Herbrand and Gödel and their use of recursion, in particular Gödel's use in his famous paper ''On Formally Undecidable Propositions of Principia Mathematica and Related Systems I'' (1931); and (3) Post (1936) and Turing (1936–37) in their mechanism-models of computation. [[Stephen C. Kleene]] defined his now-famous "Thesis I", later known as the [[Church–Turing thesis]]. But he did this in the following context (boldface in original): "12. ''Algorithmic theories''... In setting up a complete algorithmic theory, what we do is to describe a procedure, performable for each set of values of the independent variables, which procedure necessarily terminates and in such manner that from the outcome we can read a definite answer, "yes" or "no," to the question, "is the predicate value true?"" (Kleene 1943:273)
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[ "History: Development of the notion of \"algorithm\"", "History after 1950" ]
A number of efforts have been directed toward further refinement of the definition of "algorithm", and activity is on-going because of issues surrounding, in particular, [[foundations of mathematics]] (especially the [[Church–Turing thesis]]) and [[philosophy of mind]] (especially arguments about [[artificial intelligence]]). For more, see [[Algorithm characterizations]].
775
Algorithm
[ "Algorithms", "Articles with example pseudocode", "Mathematical logic", "Theoretical computer science" ]
[ "Abstract machine", "Computational complexity theory", "Algorithmic entities", "Algorithm characterizations", "Algorithm engineering", "Algorithmic synthesis", "Algorithmic technique", "Introduction to Algorithms", "List of important publications in theoretical computer science – Algorithms", "Theory of computation", "Computability theory", "Algorithmic topology", "List of algorithms", "Garbage in, garbage out", "List of algorithm general topics", "Regulation of algorithms", "Algorithmic composition" ]
[]
[[Image:Doperwt rijserwt peulen Pisum sativum.jpg|right|thumb|240px|[[Pea]]s are annual plants.]] An '''annual plant''' is a plant that completes its [[biological life cycle|life cycle]], from [[germination]] to the production of [[seed]], within one [[growing season]], and then dies. The length of growing seasons and the period in which they take place vary according to geographical location, and may not correspond to the four traditional seasonal divisions of the year. With respect to the traditional seasons, annual plants are generally categorized into summer annuals and winter annuals. Summer annuals germinate during spring or early summer and mature by autumn of the same year. Winter annuals germinate during the autumn and mature during the spring or summer of the following calendar year. One seed-to-seed life cycle for an annual can occur in as little as a month in some species, though most last several months. [[Brassica rapa|Oilseed rapa]] can go from seed to seed in about five weeks under a bank of [[fluorescent lamp]]s. This style of growing is often used in classrooms for education. Many desert annuals are [[therophyte]]s, because their seed-to-seed life cycle is only weeks and they spend most of the year as seeds to survive dry conditions.
777
Annual plant
[ "Annual plants", "Garden plants" ]
[]
[ "Cultivation" ]
In cultivation, many food plants are, or are grown as, annuals, including virtually all domesticated [[Cereal|grain]]s. Some [[perennial plant|perennials]] and [[biennial plant|biennials]] are grown in gardens as annuals for convenience, particularly if they are not considered [[cold hardy]] for the local climate. [[Carrot]], [[celery]] and [[parsley]] are true biennials that are usually grown as annual crops for their edible roots, petioles and leaves, respectively. [[Tomato]], [[sweet potato]] and [[bell pepper]] are tender perennials usually grown as annuals. Ornamental perennials commonly grown as annuals are [[impatiens]], [[Mirabilis (plant)|mirabilis]], [[begonia|wax begonia]], [[Antirrhinum|snapdragon]], ''[[pelargonium]]'', [[coleus]] and [[petunia]]. Examples of true annuals include [[maize|corn]], [[wheat]], [[rice]], [[lettuce]], [[pea]], [[watermelon]], [[bean]], [[zinnia]] and [[Tagetes|marigold]].
777
Annual plant
[ "Annual plants", "Garden plants" ]
[]
[ "Summer" ]
'''Summer annuals''' sprout, flower, produce seed, and die, during the warmer months of the year. The lawn weed [[crabgrass]] is a summer annual.
777
Annual plant
[ "Annual plants", "Garden plants" ]
[]
[ "Winter" ]
'''Winter annuals''' germinate in autumn or winter, live through the winter, then bloom in winter or spring. The plants grow and bloom during the cool season when most other plants are dormant or other annuals are in seed form waiting for warmer weather to germinate. Winter annuals die after flowering and setting seed. The seeds germinate in the autumn or winter when the soil temperature is cool. Winter annuals typically grow low to the ground, where they are usually sheltered from the coldest nights by snow cover, and make use of warm periods in winter for growth when the snow melts. Some common winter annuals include [[Lamium amplexicaule|henbit]], [[deadnettle]], [[chickweed]], and [[winter cress]]. Winter annuals are important ecologically, as they provide vegetative cover that prevents soil erosion during winter and early spring when no other cover exists, and they provide fresh vegetation for animals and birds that feed on them. Although they are often considered to be weeds in gardens, this reputation is not always deserved, as most of them die when the soil temperature warms up again in early to late spring, when other plants are still dormant and have not yet leafed out. Even though they do not compete directly with cultivated plants, winter annuals are sometimes considered a pest in commercial agriculture, because they can be hosts for insect pests or fungal diseases (e.g. ovary smut, ''Microbotryum'' sp.) which attack crops being cultivated. Their tendency to keep the soil from drying out can also be problematic for commercial agriculture.
777
Annual plant
[ "Annual plants", "Garden plants" ]
[]
[ "Molecular genetics" ]
In 2008, it was discovered that the inactivation of only two genes in one species of annual plant leads to the conversion into a [[perennial plant]]. Researchers deactivated the SOC1 and FUL genes in ''[[Arabidopsis thaliana]]'', which control flowering time. This switch established [[phenotypes]] common in perennial plants, such as wood formation.
777
Annual plant
[ "Annual plants", "Garden plants" ]
[]
[]
The '''anthophytes''' are a grouping of plant taxa bearing flower-like reproductive structures. They were formerly thought to form a [[clade]]. The group contained the [[Flowering plant|angiosperms]] - the extant flowering plants, such as [[Rosaceae|roses]] and [[Poaceae|grasses]] - as well as the [[Gnetales]] and the extinct [[Bennettitales]]. Detailed morphological and molecular studies have shown that the group is not actually [[monophyletic]], with the proposed floral homologies of the [[gnetophyte]]s and the [[angiosperm]]s having evolved in parallel. This makes it easier to reconcile molecular clock data that suggests that the angiosperms diverged from the [[gymnosperm]]s around . Some more recent studies have used the word anthophyte to describe a group which includes the angiosperms and a variety of fossils ([[Glossopteris|glossopterids]], ''[[Pentoxylon]]'', [[Bennettitales]], and ''[[Caytonia]]''), but not the Gnetales.
779
Anthophyta
[ "Historically recognized plant taxa" ]
[]
[]
An '''[[atlas]]''' is a collection of maps, originally named after the Ancient Greek deity. '''Atlas''' may also refer to:
780
Atlas (disambiguation)
[]
[ "Altas (disambiguation)" ]
[ "Mythology" ]
(-) [[Atlas (mythology)]], an Ancient Greek Titanic deity who held up the celestial sphere (-) Atlas, the first legendary king of [[Atlantis]] and a further variant of the mythical Titan (-) [[Atlas of Mauretania]], a legendary king of Mauretania and a variant of the mythical Titan
780
Atlas (disambiguation)
[]
[ "Altas (disambiguation)" ]
[ "Places", "United States" ]
(-) [[Atlas, California]] (-) [[Atlas, Illinois]] (-) [[Atlas, Texas]] (-) [[Atlas, West Virginia]] (-) [[Atlas, Wisconsin]] (-) [[Atlas District]], an area in Washington, D.C. (-) [[Atlas Peak AVA]], a California wine region (-) [[Atlas Township, Michigan]]
780
Atlas (disambiguation)
[]
[ "Altas (disambiguation)" ]
[ "Places", "Other" ]
(-) [[Atlas Cinema]], a historic movie theatre in Istanbul, Turkey (-) [[Atlas Mountains]], a set of mountain ranges in northwestern Africa (-) [[Atlas, Nilüfer]], a village in Nilüfer district of Bursa Province, Turkey
780
Atlas (disambiguation)
[]
[ "Altas (disambiguation)" ]
[ "People with the name" ]
(-) [[Atlas (graffiti artist)]], American graffiti artist (-) [[Atlas DaBone]], American wrestler and football player (-) [[Charles Atlas]] (1892–1972), Italian-American bodybuilder (-) [[Charles Atlas (artist)]] (-) [[David Atlas]] (born 1924), American meteorologist who pioneered weather radar (-) [[James Atlas]] (1949-2019), American writer, editor and publisher (-) [[Meir Atlas]] (1848–1926), Lithuanian rabbi (-) [[Natacha Atlas]] (born 1964), Belgian singer (-) [[Nava Atlas]], American book artist and author (-) [[Omar Atlas]] (born 1938), former Venezuelan professional wrestler (-) [[Scott Atlas]] (born 1955), American conservative health care policy advisor (-) [[Teddy Atlas]] (born 1956), American boxing trainer and commentator (-) [[Tony Atlas]] (born 1954), American wrestler and bodybuilder
780
Atlas (disambiguation)
[]
[ "Altas (disambiguation)" ]
[ "Arts, entertainment, and media", "Comics" ]
(-) [[Atlas (Drawn and Quarterly)|''Atlas'' (Drawn and Quarterly)]], a comic book series by Dylan Horrocks (-) ''[[Agents of Atlas]]'', a Marvel Comics mini-series (-) [[Atlas Comics (1950s)]], a publisher (-) [[Atlas/Seaboard Comics]], a 1970s line of comics
780
Atlas (disambiguation)
[]
[ "Altas (disambiguation)" ]
[ "Arts, entertainment, and media", "Fictional characters" ]
(-) [[Atlas (DC Comics)]], the name of several of DC Comics' fictional characters, comic book superheroes, and deities (-) [[Atlas (Teen Titans)|Atlas (''Teen Titans'')]], ''Teen Titans'' character (-) Atlas, an [[Astro Boy (1980 TV series)#Atlas|''Astro Boy'' character]] (-) Atlas, a ''[[BioShock (series)|BioShock]]'' character (-) Atlas, a [[BattleMech]] in the ''BattleTech'' universe (-) Atlas, an antagonist in ''[[Mega Man ZX Advent]]'' (-) Atlas, a ''[[Portal 2]]'' character (-) Atlas, a ''[[PS238]]'' character (-) [[Erik Josten]], a.k.a. Atlas, a Marvel Comics supervillain (-) The Atlas, a strong driving force from ''[[No Man's Sky]]''
780
Atlas (disambiguation)
[]
[ "Altas (disambiguation)" ]
[ "Arts, entertainment, and media", "Literature" ]
(-) ''Atlas'', a photography book by [[Gerhard Richter]] (-) ''[[ATLAS of Finite Groups]]'', a group theory book (-) ''[[Atlas Shrugged]]'', a novel by Ayn Rand (-) [[The Atlas (novel)|''The Atlas'' (novel)]], by William T. Vollmann
780
Atlas (disambiguation)
[]
[ "Altas (disambiguation)" ]
[ "Arts, entertainment, and media", "Music", "Groups" ]
(-) [[Atlas (band)]], a New Zealand rock band (-) [[Atlas Sound]], the solo musical project of Bradford Cox, lead singer and guitarist of the indie rock band Deerhunter
780
Atlas (disambiguation)
[]
[ "Altas (disambiguation)" ]
[ "Arts, entertainment, and media", "Music", "Albums" ]
(-) [[Atlas (Kinky album)|''Atlas'' (Kinky album)]] (-) [[Atlas (Parkway Drive album)|''Atlas'' (Parkway Drive album)]], Parkway Drive's fourth album (-) [[Atlas (Real Estate album)|''Atlas'' (Real Estate album)]] (-) [[Atlas (RÜFÜS album)|''Atlas'' (RÜFÜS album)]]
780
Atlas (disambiguation)
[]
[ "Altas (disambiguation)" ]
[ "Arts, entertainment, and media", "Music", "Operas" ]
(-) [[Atlas (opera)|''Atlas'' (opera)]], 1991 opera by Meredith Monk (-) ''[[Atlas: An Opera in Three Parts]]'', 1993 recording of Monk's opera
780
Atlas (disambiguation)
[]
[ "Altas (disambiguation)" ]
[ "Arts, entertainment, and media", "Music", "Songs" ]
(-) [[Atlas (Battles song)|"Atlas" (Battles song)]], 2007 song by Battles on the album ''Mirrored'' (-) [[Atlas (Coldplay song)|"Atlas" (Coldplay song)]], 2013 song by Coldplay from ''The Hunger Games: Catching Fire'' soundtrack (-) "Atlas", a song by Caligula's Horse from the album ''[[The Tide, the Thief & River's End]]'' (-) "Atlas", the titular song from [[Parkway Drive]]'s fourth album (-) "Atlas", a song by Man Overboard from ''[[Man Overboard (Man Overboard album)|Man Overboard]]'' (-) "Atlas", a song by Jake Chudnow used as main theme in the YouTube series ''[[Mind Field]]''
780
Atlas (disambiguation)
[]
[ "Altas (disambiguation)" ]
[ "Arts, entertainment, and media", "Periodicals" ]
(-) [[Atlas (magazine)|''Atlas'' (magazine)]] (-) [[The Atlas (newspaper)|''The Atlas'']], a newspaper published in England from 1826 to 1869
780
Atlas (disambiguation)
[]
[ "Altas (disambiguation)" ]
[ "Arts, entertainment, and media", "Other uses in arts, entertainment, and media" ]
(-) [[Atlas (film)|''Atlas'' (film)]] (-) [[Atlas (statue)|''Atlas'' (statue)]], iconic statue by Lee Lawrie in Rockefeller Center (-) Atlas, a book about flora and/or fauna of a region, such as [[atlases of the flora and fauna of Britain and Ireland]] (-) [[Atlas Entertainment]], a film production company (-) Atlas folio, a [[book size]] (-) [[Atlas Media Corp.]], a non-fiction entertainment company (-) [[Atlas Press]], a UK publisher (-) [[RTV Atlas]], a broadcaster in Montenegro (-) Atlas Sound, a solo musical project by [[Bradford Cox#Atlas Sound|Bradford Cox]] (-) [[The Atlas (video game)|''The Atlas'' (video game)]], a 1991 multiplatform strategy video game (-) [[Atlas (video game)|''Atlas'' (video game)]], an upcoming massively-multiplayer online video game (-) Atlas Corporation, a fictional arms manufacturer in the video game series ''[[Borderlands (series)]]''
780
Atlas (disambiguation)
[]
[ "Altas (disambiguation)" ]
[ "Brands and enterprises" ]
(-) [[Atlas (appliance company)]], a Belarusian company (-) [[Atlas Consortium]], a group of technology companies (-) [[Atlas Copco]], Swedish company founded in 1873 (-) [[Atlas Corporation]], an investment company (-) [[Atlas Elektronik]], a German naval/marine electronics and systems business (-) [[Atlas Group]], a Pakistani business group (-) [[Atlas Mara Limited]], formerly Atlas Mara Co-Nvest Limited, a financial holding company that owns banks in Africa (-) [[Atlas Model Railroad]], American maker of model trains and accessories (-) [[Atlas Network]], formerly Atlas Economic Research Foundation (-) [[Atlas Press (tool company)]] (-) [[Atlas Solutions]], a subsidiary of Facebook for digital online advertising, formerly owned by Microsoft (-) [[Atlas Telecom]], a worldwide communications company (-) [[Atlas Van Lines]], a moving company (-) [[Atlas-Imperial]], an American diesel engine manufacturer (-) [[Dresser Atlas]], a provider of oilfield and factory automation services (-) [[Tele Atlas]], a Dutch mapping company (-) [[Western Atlas]], an oilfield services company
780
Atlas (disambiguation)
[]
[ "Altas (disambiguation)" ]
[ "Computing and technology" ]
(-) [[Atlas (computer)]], an early supercomputer, built in the 1960s (-) [[Atlas (robot)]], a humanoid robot developed by Boston Dynamics and DARPA (-) Atlas, a computer used at the [[Lawrence Livermore National Laboratory]] in 2006 (-) [[Abbreviated Test Language for All Systems]], or ATLAS, a MILSPEC language for avionics equipment testing (-) [[Advanced Technology Leisure Application Simulator]], or ATLAS, a hydraulic motion simulator used in theme parks (-) [[ASP.NET AJAX]] (formerly "Atlas"), a set of ASP.NET extensions (-) [[ATLAS Transformation Language]], a programming language (-) [[Atlas.ti]], a qualitative analysis program (-) [[Automatically Tuned Linear Algebra Software]], or ATLAS, a software library for numerical linear algebra (-) [[Texture atlas]], or image sprite sheet (-) [[UNIVAC 1101]], an early American computer, built in the 1950s
780
Atlas (disambiguation)
[]
[ "Altas (disambiguation)" ]
[ "Science", "Astronomy" ]
(-) [[Atlas (comet)]] (C/2019 Y4) (-) [[Atlas (crater)]] on the near side of the Moon (-) [[Atlas (moon)]], a satellite of Saturn (-) [[Atlas (star)]], also designated 27 Tauri, a triple star system in the constellation of Taurus and a member of the Pleiades (-) [[Advanced Technology Large-Aperture Space Telescope]] (ATLAST) (-) Advanced Topographic Laser Altimeter System (ATLAS), a space-based lidar instrument on [[ICESat-2]] (-) [[Asteroid Terrestrial-impact Last Alert System]] (ATLAS)
780
Atlas (disambiguation)
[]
[ "Altas (disambiguation)" ]
[ "Science", "Mathematics" ]
(-) [[Manifold#Atlases|Atlas (manifolds)]], a set of smooth charts (-) [[Atlas (topology)]], a set of charts (-) [[Smooth structure|Smooth atlas]]
780
Atlas (disambiguation)
[]
[ "Altas (disambiguation)" ]
[ "Science", "Physics" ]
(-) [[Argonne Tandem Linear Accelerator System]], or ATLAS, a linear accelerator at the Argonne National Laboratory (-) [[ATLAS experiment]], a particle detector for the Large Hadron Collider at CERN (-) [[Atomic-terrace low-angle shadowing]], or ATLAS, a nanofabrication technique
780
Atlas (disambiguation)
[]
[ "Altas (disambiguation)" ]
[ "Science", "Biology and healthcare" ]
(-) [[Atlas (anatomy)]], part of the spine (-) [[Atlas personality]], a term used in psychology to describe the personality of someone whose childhood was characterized by excessive responsibilities (-) [[Brain atlas]], a neuroanatomical map of the brain of a human or other animal
780
Atlas (disambiguation)
[]
[ "Altas (disambiguation)" ]
[ "Science", "Biology and healthcare", "Animals and plants" ]
(-) [[Atlas bear]] (-) [[Atlas beetle]] (-) [[Atlas cedar]] (-) [[Atlas moth]] (-) [[Atlas pied flycatcher]], a bird (-) [[Atlas turtle]]
780
Atlas (disambiguation)
[]
[ "Altas (disambiguation)" ]
[ "Sport" ]
(-) [[Atlas Delmenhorst]], a German association football club (-) [[Atlas F.C.]], a Mexican professional football club (-) [[Club Atlético Atlas]], an Argentine amateur football club (-) [[KK Atlas]], a former men's professional basketball club based in Belgrade (today's Serbia)
780
Atlas (disambiguation)
[]
[ "Altas (disambiguation)" ]
[ "Transport", "Aerospace" ]
(-) [[Atlas (rocket family)]] (-) [[SM-65 Atlas]] intercontinental ballistic missile (ICBM) (-) [[AeroVelo Atlas]], a human-powered helicopter (-) [[Airbus A400M Atlas]], a military aircraft produced 2007–present (-) [[Armstrong Whitworth Atlas]], a British military aeroplane produced 1927–1933 (-) [[Atlas Air]], an American cargo airline (-) [[Atlas Aircraft]], a 1940s aircraft manufacturer (-) [[Atlas Aircraft Corporation]], a South African military aircraft manufacturer (-) [[Atlas Aviation]], an aircraft maintenance firm (-) [[Atlas Blue]], a Moroccan low-cost airline (-) [[Atlasjet]], a Turkish airline (-) [[Birdman Atlas]], an ultralight aircraft (-) [[HMLAT-303]], U.S. Marine Corps helicopter training squadron (-) [[La Mouette Atlas]], a French hang glider design
780
Atlas (disambiguation)
[]
[ "Altas (disambiguation)" ]
[ "Transport", "Automotive" ]
(-) [[Atlas (1951 automobile)]], a French mini-car (-) [[Atlas (light trucks)]], a Greek motor vehicle manufacturer (-) [[Atlas (Pittsburgh automobile)]], produced 1906–1907 (-) [[Atlas (Springfield automobile)]], produced 1907–1913 (-) Atlas, a British van by the [[Standard Motor Company]] produced 1958–1962 (-) Atlas Drop Forge Company, a parts subsidiary of [[REO Motor Car Company]] (-) [[Atlas Motor Buggy]], an American highwheeler produced in 1909 (-) [[General Motors Atlas engine]] (-) [[Honda Atlas Cars Pakistan]], a Pakistani car manufacturer (-) [[Nissan Atlas]], a Japanese light truck (-) [[Volkswagen Atlas]], a sport utility vehicle (-) [[Geely Atlas]], a sport utility vehicle
780
Atlas (disambiguation)
[]
[ "Altas (disambiguation)" ]
[ "Transport", "Ships and boats" ]
(-) [[Atlas Werke]], a former German shipbuilding company (-) [[HMS Atlas|HMS ''Atlas'']], the name of several Royal Navy ships (-) [[ST Atlas|ST ''Atlas'']], a Swedish tugboat (-) [[USS Atlas|USS ''Atlas'']], the name of several U.S. Navy ships
780
Atlas (disambiguation)
[]
[ "Altas (disambiguation)" ]
[ "Transport", "Trains" ]
(-) Atlas, an 1863–1885 [[South Devon Railway Dido class]] locomotive (-) Atlas, a 1927–1962 [[LMS Royal Scot Class]] locomotive (-) [[Atlas Car and Manufacturing Company]], a locomotive manufacturer (-) [[Atlas Model Railroad]]
780
Atlas (disambiguation)
[]
[ "Altas (disambiguation)" ]
[ "Other uses" ]
(-) [[Atlas (architecture)]] (-) [[ATLAS (simulation)]] (Army Tactical Level Advanced Simulation), a Thai military system (-) [[Atlas (storm)]], which hit the Midwestern United States in October 2013, named by The Weather Channel (-) [[Agrupación de Trabajadores Latinoamericanos Sindicalistas]], or ATLAS, a former Latin American trade union confederation in the early 1950s (-) [[Atlas languages]], Berber languages spoken in the Atlas Mountains of Morocco (-) [[ATLAS Network]], a network of European special police units (-) [[Atlas Uranium Mill]]
780
Atlas (disambiguation)
[]
[ "Altas (disambiguation)" ]
[]
'''Mouthwash''', '''mouth rinse''', '''oral rinse''', or '''mouth bath''' is a liquid which is held in the mouth passively or swilled around the mouth by contraction of the perioral muscles and/or movement of the head, and may be [[gargling|gargled]], where the head is tilted back and the liquid bubbled at the back of the mouth. Usually mouthwashes are [[antiseptic]] solutions intended to reduce the microbial load in the oral cavity, although other mouthwashes might be given for other reasons such as for their [[analgesic]], [[anti-inflammatory]] or [[anti-fungal medication|anti-fungal]] action. Additionally, some rinses act as saliva substitutes to neutralize acid and keep the mouth moist in [[xerostomia]] (dry mouth). Cosmetic mouthrinses temporarily control or reduce bad breath and leave the mouth with a pleasant taste. Rinsing with water or mouthwash after brushing with a [[Toothpaste#Fluorides|fluoride toothpaste]] can reduce the availability of salivary fluoride. This can lower the anti-cavity re-mineralization and antibacterial effects of fluoride. Fluoridated mouthwash may mitigate this effect or in high concentrations increase available fluoride, but is not as cost effective as leaving the fluoride toothpaste on the teeth after brushing. A group of experts discussing post-brushing rinsing in 2012 found that, although there was clear guidance given in many public health advice publications to "spit, avoid rinsing with water/excessive rinsing with water", they believed there was a limited evidence base for best practice.
782
Mouthwash
[ "Dentifrices", "Oral hygiene", "Drug delivery devices", "Dosage forms" ]
[]
[ "Use" ]
Common use involves rinsing the mouth with about 20–50 [[milliliter|ml]] (about 0.7–1.7 [[fluid ounce|fl oz]]) of mouthwash. The wash is typically swished or gargled for about half a minute and then spat out. Most companies suggest not drinking water immediately after using mouthwash. In some brands, the [[expectorate]] is stained, so that one can see the bacteria and debris. Mouthwash should not be used immediately after brushing the teeth so as not to wash away the beneficial fluoride residue left from the toothpaste. Similarly, the mouth should not be rinsed out with water after brushing. Patients were told to "spit don't rinse" after toothbrushing as part of a [[National Health Service]] campaign in the UK. A fluoride mouthrinse can be used at a different time of day from brushing. Gargling is where the head is tilted back, allowing the mouthwash to sit in the back of the mouth while exhaling, causing the liquid to bubble. Gargling is practiced in [[Japan]] for perceived prevention of viral infection; one commonly used gargling liquid is an [[infusion]] or [[tea]]. In some cultures, gargling is usually done in private, typically in a [[bathroom]] at a sink so the liquid can be rinsed away.
782
Mouthwash
[ "Dentifrices", "Oral hygiene", "Drug delivery devices", "Dosage forms" ]
[]
[ "Benefits and side effects" ]
The most common mouthwashes are commercial antiseptics, which are used at home as part of an [[oral hygiene]] routine. Mouthwashes combine ingredients to treat a variety of oral conditions. Variations are common, and mouthwash has no standard formulation, so its use and recommendation involve concerns about [[patient safety]]. Some manufacturers of mouthwash state that antiseptic and anti-plaque mouth rinses kill the [[Dental plaque|bacterial plaque]] that causes [[Dental caries|cavities]], [[gingivitis]], and [[bad breath]]. It is, however, generally agreed that the use of mouthwash does not eliminate the need for both [[toothbrush|brushing]] and [[flossing]]. The [[American Dental Association]] asserts that regular brushing and proper flossing are enough in most cases, in addition to regular dental check-ups, although they approve many mouthwashes. For many patients, however, the mechanical methods can be tedious and time-consuming, and additionally some local conditions may render them especially difficult. Chemotherapeutic agents, including mouthrinses, could have a key role as adjuncts to daily home care, preventing and controlling supragingival plaque, gingivitis and oral malodor. Minor and transient side effects of mouthwashes are very common, such as [[Dysgeusia|taste disturbance]], tooth staining, [[xerostomia|sensation of a dry mouth]], etc. Alcohol-containing mouthwashes may make dry mouth and halitosis worse since they dry out the mouth. Soreness, ulceration and redness may sometimes occur (e.g. [[aphthous stomatitis]], [[allergic contact stomatitis]]) if the person is allergic or sensitive to mouthwash ingredients such as preservatives, coloring, flavors and fragrances. Such effects might be reduced or eliminated by diluting the mouthwash with water, using a different mouthwash (e.g. salt water), or foregoing mouthwash entirely. Prescription mouthwashes are used prior to and after oral surgery procedures such as [[tooth extraction]] or to treat the pain associated with [[mucositis]] caused by [[radiation therapy]] or [[chemotherapy]]. They are also prescribed for [[aphthous ulcer]]s, other [[oral ulcer]]s, and other mouth pain. Magic mouthwashes are prescription mouthwashes [[Compounding|compounded]] in a [[pharmacy]] from a list of ingredients specified by a doctor. Despite a lack of evidence that prescription mouthwashes are more effective in decreasing the pain of oral [[lesion]]s, many patients and prescribers continue to use them. There has been only one [[Controlled experiment|controlled study]] to evaluate the [[efficacy]] of magic mouthwash; it shows no difference in efficacy between the most common formulation and commercial mouthwashes such as [[chlorhexidine]] or a [[Saline (medicine)|saline]]/[[baking soda]] solution. Current guidelines suggest that saline solution is just as effective as magic mouthwash in pain relief or shortening of healing time of oral mucositis from cancer therapies.
782
Mouthwash
[ "Dentifrices", "Oral hygiene", "Drug delivery devices", "Dosage forms" ]
[]
[ "History" ]
The first known references to mouth rinsing are in [[Ayurveda]], for the treatment of gingivitis. Later, in the [[ancient Greece|Greek]] and [[Ancient Rome|Roman]] periods, mouth rinsing following mechanical cleansing became common among the upper classes, and [[Hippocrates]] recommended a mixture of salt, [[alum]], and vinegar. The Jewish [[Talmud]], dating back about 1,800 years, suggests a cure for gum ailments containing "dough water" and olive oil. Before Europeans came to the Americas, Native North American and Mesoamerican cultures used mouthwashes, often made from plants such as ''[[Coptis trifolia]]''. Indeed, [[Aztec]] dentistry was more advanced than European dentistry of the age. Peoples of the Americas used salt water mouthwashes for sore throats, and other mouthwashes for problems such as [[teething]] and mouth ulcers. [[Anton van Leeuwenhoek]], the famous 17th century [[Microscopy|microscopist]], discovered living organisms (living, because they were mobile) in deposits on the teeth (what we now call [[dental plaque]]). He also found organisms in water from the canal next to his home in Delft. He experimented with samples by adding vinegar or brandy and found that this resulted in the immediate immobilization or killing of the organisms suspended in water. Next he tried rinsing the mouth of himself and somebody else with a mouthwash containing vinegar or brandy and found that living organisms remained in the dental plaque. He concluded—correctly—that the mouthwash either did not reach, or was not present long enough, to kill the plaque organisms. In 1892, the German [[Richard Seifert (inventor)|Richard Seifert]] invented the mouthwash product [[Odol]], which was produced by company founder [[Karl August Lingner]] (1861–1916) in [[Dresden]]. That remained the state of affairs until the late 1960s, when Harald Loe (at the time a professor at the [[Aarhus University|Royal Dental College]] in [[Aarhus]], [[Denmark]]) demonstrated that a [[chlorhexidine]] compound could prevent the build-up of dental plaque. The reason for chlorhexidine's effectiveness is that it strongly adheres to surfaces in the mouth and thus remains present in effective concentrations for many hours. Since then commercial interest in mouthwashes has been intense, and several newer products claim effectiveness in reducing the build-up of dental plaque and the associated severity of gingivitis, in addition to fighting bad breath. Many of these solutions aim to control the Volatile Sulfur Compound (VSC)-creating anaerobic bacteria that live in the mouth and excrete substances that lead to bad breath and unpleasant mouth taste. For example, the number of mouthwash variants in the United States of America has grown from 15 (1970) to 66 (1998) to 113 (2012).
782
Mouthwash
[ "Dentifrices", "Oral hygiene", "Drug delivery devices", "Dosage forms" ]
[]