Wednesday, March 22, 2017

Computer Palace (Atari 400/800)

Trump’s Budget Paves the Road to Fiscal Failure

President Donald Trump has issued his preliminary federal budget proposal for the U.S. government’s next fiscal year. What it shows is that there will likely be no attempt to reduce the size and cost of most of the American interventionist-welfare state.

On Thursday, March 16, 2017, the White House released, “America First: A Budget Blueprint to Make America Great Again.” Listening to the comments of some on the political left, you would think that the world was going to come to an end. For many on the political right, the programs placed on the chopping block for reduction or near elimination seem like a dream come true–if the budgetary proposals were to be implemented.

Furthermore, the blueprint offers an insight into the mind of Donald Trump about the role of government in society. When the budget was released, Mick Mulvaney, the director of the Office of Management and Budget, said that this was Donald Trump’s fiscal vision for America. "If he said it on the campaign, it's in the budget," Mulvaney declared. "We wrote it using the president's own words."

Same Entitlements, More Defense Spending

Even a cursory look at President Trump’s budgetary proposals reveals that he plans to leave “entitlement programs” untouched while reallocating approximately 30 percent of the federal budget’s “discretionary” expenditures from one set of activities to another. Neither the total amount of government spending nor the likely budget deficit is threatened with meaningful reduction.

In the current 2017 federal fiscal year, Social Security, Medicare, and related spending make up almost 64 percent of Uncle Sam’s expenditures. Net interest on the nearly $20 trillion national debt makes up another 7 percent of federal spending. Of the remaining 30 percent or so of the budget, defense spending absorbs 15 percent of federal outflows.

The budget proposal makes it clear that President Trump is devoted to expanding military capabilities for continued foreign intervention. A foreign policy focused on “America First” is losing none of its global reach or the military hardware to back it up.

During his March 17 press conference with visiting German Chancellor Angela Merkel, Donald Trump reiterated that he was not a foreign policy isolationist. Indeed, he emphasized his allegiance to NATO and its role in Europe. At the same time, Secretary of State Rex Tillerson was at the demilitarized zone between North Korea and South Korea, declaring that nothing was off the table, including a preemptive military attack on North Korea’s nuclear capability.

For conservatives and classical liberals who hope for a foreign policy that leaves the United States less vulnerable to regional foreign conflicts, President Trump and his cabinet members are making it clear that America’s political and military allies must pick up more of the financial tab for the joint policing of different parts of the world.

Reflecting this, the president’s blueprint proposes to increase Defense Department spending by $54 billion, which would put military expenditures for 2018 at a total of $603 billion. The Department of Homeland Security would gain an additional $2.8 billion for a total in 2018 of around $70 billion.

The eyes and ears of the surveillance state will also remain intact and grow. The only wiretapping that President Trump seems to mind was an alleged eavesdropping on his own conversations before he took office. As for the rest of us, well, Big Brother is watching and listening–for our own good. After all, it’s all part of making America “great” and “safe” again.

Cue Progressive Whining

To pay for increases in the warfare state, President Trump’s budgetary axe has fallen on a variety of “discretionary” welfare and redistributive programs. To cover the $54 billion increase in defense spending, $54 billion is to be cut from the non-defense half of that roughly 30 percent discretionary slice of the budget. It’s worth keeping in mind that all the teeth-gnashing on the left is over a less than 1.5 percent decrease in the more than $4 trillion that Uncle Sam will spend in 2018.
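As a rough sanity check on that percentage, here is the back-of-the-envelope arithmetic; the $54 billion and $4 trillion figures are the article's round numbers, not exact budget totals:

```python
# Back-of-the-envelope check: how big is the proposed cut relative to
# total projected federal spending? (Round numbers from the article.)
defense_offset = 54e9    # $54 billion in cuts offsetting the defense increase
total_spending = 4e12    # roughly $4 trillion in projected 2018 outlays

cut_share = defense_offset / total_spending
print(f"The offsetting cuts are about {cut_share:.2%} of total spending")
# 54e9 / 4e12 = 0.0135, i.e. about 1.35% -- "less than 1.5 percent," as stated
```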

It must be admitted, conservative and classical liberal hearts can only be warmed by virtually every cut in this part of the budget. For example, Department of Agriculture spending will be reduced by 20.7 percent. However, it is worth observing that subsidies paid to farmers, including subsidies for not growing crops, are not on the chopping block. Trump does not want to antagonize a crucial part of rural Republican America that lives at the trough of government spending.

On the other hand, the State Department and related foreign aid programs would be slashed by almost 29 percent. Not many tears need be shed here, given that State Department programs and personnel are at the heart of America’s misguided global social engineering schemes, and foreign aid is merely a slush fund for foreign political power lusters that undermine real market-oriented economic development in other parts of the world.

The list goes on: Housing and Urban Development, down 12 percent; Health and Human Services, cut 16 percent; Commerce Department, reduced 16 percent; Education Department, decreased by over 13 percent (but with a shift of funds to increase falsely named “school choice” programs). The Interior Department is down almost 12 percent; the Labor Department cut nearly 21 percent.

The Environmental Protection Agency would be cut by over 31 percent. The climate and land-use social engineers are being driven berserk by this one. To hear them tell it, reining in the swarms of regulatory locusts that plague the country with wetland rules, land-use restrictions, market-hampering prohibitions, and abridgments of private property rights portends the end of planet Earth. The heavens will darken, the seas will rise, and the land will be barren. How will humanity survive without self-righteous elitists leading mankind to socially sensitive, greener pastures?

O! The Humanities!

Additionally, the National Endowment for the Arts, the National Endowment for the Humanities, the Institute for Museum and Library Services, and the Corporation for Public Broadcasting are targeted for a virtual 100 percent cut. Those concerned about the arts and humanities may have to put their private money where their mouths are.

The thought that those who listen to the moralizing, collectivist voices on National Public Radio may have to pay for it (either out of their own pockets or from capitalist commercial interruptions) is just too much for these delicate souls to bear.

Political pocket-pickers are warning that planned “Meals on Wheels” spending cuts threaten the poor and aged with starvation. In fact, 65 percent of the program’s funding comes from private donations or local and state governments, with only 35 percent coming from federal dollars. Furthermore, the day after the budget blueprint was released, the media reported that Meals on Wheels chapters around the country saw private donations running more than 50 percent above their usual rate. Private benevolence–amazingly!–materialized almost instantly, replacing coercively collected funding with voluntary giving to a charity that many people evidently consider worthy.

Leaving the Entitlement State Intact

Donald Trump’s budgetary blueprint for American greatness needs to be put into the wider context. Where does this leave the size and scope of government in the United States?

Alas, Trump’s budget leaves it seemingly untouched. The entitlement programs–above all the redistributive spending surrounding Social Security and Medicare–are feeding the insatiable growth of America’s domestic system of political paternalism.

Under current legislation, their cost and intrusiveness will only get worse. In its January 2017 long-term federal government budgetary forecast, the Congressional Budget Office estimates that if nothing changes legislatively, the “entitlement” programs will end up consuming nearly 80 percent of all the taxes collected by the United States government.

Since the remaining 20 percent of projected federal tax revenues will not cover all projected defense and other “discretionary” spending, plus interest on the national debt, between 2018 and 2027 the United States government will continue to run large annual budget deficits. This will add $10 trillion more to the total national debt over the next decade.
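To make the scale concrete, a minimal sketch of what those round numbers imply; the $20 trillion and $10 trillion figures are the article's approximations, not CBO's year-by-year projections:

```python
# Sketch: $10 trillion added over the 2018-2027 decade implies average
# annual deficits of roughly $1 trillion. Round numbers from the article,
# not actual CBO year-by-year projections.
starting_debt = 20e12   # approximate national debt today
added_debt = 10e12      # projected addition over the next decade
years = 10

avg_annual_deficit = added_debt / years
ending_debt = starting_debt + added_debt
print(f"Average annual deficit: ${avg_annual_deficit / 1e12:.1f} trillion")
print(f"Debt after the decade:  ${ending_debt / 1e12:.0f} trillion")
```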

Donald Trump made it clear during the primary and general presidential election campaigns in 2016 that he considers Social Security and Medicare sacrosanct, not subject to the budget cutter’s chopping block. In addition, ObamaCare may be repealed, but the reform that Trump and the Republican leadership in Congress have in mind will still leave a heavy fiscal footprint. This, too, will maintain and entrench Uncle Sam’s intrusive presence in the healthcare and medical insurance business, and will, inescapably, cost a lot of government dollars, though the full estimates are still forthcoming.

The Proposed Cuts Are Unlikely

Keep in mind that Trump’s budgetary blueprint is merely his administration’s recommendation to Congress, and especially to the House of Representatives where spending legislation is constitutionally supposed to originate. Already the grumbling has begun to be heard, not only from the Democratic Party minority in Congress but from members of the Republican Party majority, as well.

Abstract spending cuts almost always make good campaign rhetoric, especially for Republicans running for elected office, but like their Democratic Party counterparts, Republicans soon find themselves pressured by, and dependent upon, the financial support of special interest groups, each of which feeds off of concrete government spending dollars.

The resulting resistance to fiscal repeal and retrenchment turns out to be no different than with the groups surrounding the Democrats. Plus, the Republican foreign policy hawks have all the big-spending military contractors to serve in the name of warding off foreign threats to American greatness.

At the end of the day, when the actual 2018 federal fiscal budget gets passed by Congress and signed by the president, it will no doubt contain fewer of the discretionary spending cuts than proposed in Trump’s blueprint. Other than adding whatever “repeal and reform” emerges out of the contest between ObamaCare and TrumpCare (or RyanCare), the “entitlement” portion of the federal government’s budget will remain untouched.

Challenging the Entitlement Premises

The fact is, America continues to move in the long-run direction of fiscal unsustainability. The supposedly untouchable “entitlement” segment of the federal budget will have to be made touchable. Nearly 90 years ago, in 1930, the famous “Austrian” economist Ludwig von Mises said to an audience of Viennese industrialists during an earlier economic crisis:
Whenever there is talk about decreasing public expenditures, the advocates of this fiscal spending policy voice their objection, saying that most of the existing expenditures, as well as the increasing expenditures, are inevitable . . . What exactly does ‘inevitable’ mean in this context?

That the expenditures are based on various laws that have been passed in the past is not an objection if the argument for eliminating these laws is based on their damaging effects on the economy. The metaphorical use of the term ‘inevitable’ is nothing but a haven in which to hide in the face of an inability to comprehend the seriousness of our situation. People do not want to accept the fact that the public budget has to be radically reduced.
If there is any chance of stopping, reversing and repealing the welfare state, the entitlement language in political discourse has to be challenged. “Entitlement” presumes a right to something by some in the society, which in the modern redistributive mindset equally presumes an obligation to others to provide it.

It is essential to emphasize and explain the dollars and cents of the fiscal unsustainability of the entitlement society. And there are plenty of historical examples demonstrating that the welfare state can lead a society down the road to ruin.

In addition, the entitlement mindset must be confronted with an articulate and reasoned defense of individual liberty, based on a philosophy of individual rights to life, liberty, and honestly acquired property. Plus, the ethics of liberty must be shown to be inseparable from the idea of peaceful and voluntary association among people in all facets of life, and that government’s role is to secure and protect such liberty and individual rights, not to abridge and violate them.

If this is not done, and done successfully, the road to fiscal failure and paternalistic serfdom may prove impossible to exit.

Richard M. Ebeling
Richard M. Ebeling is BB&T Distinguished Professor of Ethics and Free Enterprise Leadership at The Citadel in Charleston, South Carolina. He was president of the Foundation for Economic Education (FEE) from 2003 to 2008.

This article was originally published on FEE.org. Read the original article.

Monday, March 20, 2017

Vault 7 Confirms, You’re Right to Be Paranoid

On March 7, the transparency/disclosure activists at WikiLeaks began releasing a series of documents titled “Vault 7.” According to the New York Times, Vault 7 consists of “thousands of pages describing sophisticated software tools and techniques used by the [US Central Intelligence Agency] to break into smartphones, computers and even Internet-connected televisions.”

Stranger Than Fiction

If the documents are authentic — and WikiLeaks has a sterling reputation when it comes to document authenticity — every paranoid thriller you’ve ever watched or read was too timid in describing a hypothetical Surveillance State. Even the telescreens and random audio bugs of George Orwell’s 1984 don’t come close to the reality of the CIA’s surveillance operations.

In theory, the CIA doesn’t spy on Americans in America. In fact, digital traffic pays no heed to national borders, and the tools and tactics described have almost certainly been made available to, or independently developed by, other US surveillance agencies, not to mention foreign governments and non-government actors.

Bottom line: You should accept the possibility that for the last several years anything you’ve done on, or in the presence of, a device that can connect to the Internet was observed, monitored, and archived as accessible data.

Paranoid? Yes. But the paranoia is justified.

Even if “they” — the CIA, the NSA, the FBI, some random group of credit card thieves or voyeurs or whatever — aren’t out to get you in particular, they consider your personal privacy a technical obstacle to overcome, not a value to respect.

All the Skeletons

If you’ve got nothing to hide you’ve got nothing to fear? Everyone has something to hide. Somewhere, sometime, you’ve said or done something you regret or wouldn’t want the world to know. And you probably said or did it within a few feet of your smartphone, your laptop, or your Internet-connected television. Maybe nobody was listening or watching. Or maybe someone was. The only plausible conclusion from the Vault 7 disclosures is that you should assume the latter.

Vault 7 confirms that as a State entity, the CIA answers to philosopher Anthony de Jasay’s description of the State as such. Just as a firm acts to maximize profits, the State and its arms act to maximize their own discretionary power. Even if it doesn’t do some particular thing, it requires the option, the ability to do that thing. It seeks omnipotence.

The abuses of our privacy implied by the WikiLeaks dump aren’t an aberration. They’re the norm. They’re what government does.

Reprinted from Libertarian Institute.

Thomas Knapp
Thomas L. Knapp, aka KN@PPSTER, is Director and Senior News Analyst at the William Lloyd Garrison Center for Libertarian Advocacy Journalism and publisher of Rational Review News Digest. He lives and works in north central Florida.

This article was originally published on FEE.org. Read the original article.

Leftists Understand Economics When it Suits Them

What’s the right way to define good tax policy? There are several possible answers to that question, including the all-important observation that the goal should be to only collect the amount of revenue needed to finance the legitimate functions of government and not one penny above that amount.

But what if we want a more targeted definition? A simple principle to shape our understanding of tax policy?

I’m partial to what I wrote last year.
the essential insight of supply-side economics…when you tax something, you get less of it.
I’m not claiming this is my idea, by the way. It’s been around for a long time.

Indeed, it’s rumored that Reagan shared a version of this wisdom.

I don’t know if the Gipper actually said those exact words, but his grasp of tax policy was very impressive. And the changes he made led to very good results, even if folks on the left still refuse to believe the IRS data showing that Reagan’s lower tax rates on the rich generated more revenue.

In any event, our friends on the nanny-state left actually understand this principle when it suits their purposes. They propose sugar taxes, soda taxes, carbon taxes, housing taxes, tanning taxes, tobacco taxes, and even “adult entertainment” taxes with the explicit goal of using the tax code to reduce the consumption of things they don’t like.

I don’t like the idea of government trying to dictate what people do with their own money, but these so-called sin taxes generally are successful because supply-siders are right about taxes impacting incentives.

The Belarusian Idleness Tax

But that doesn’t mean such policies are always popular when statist governments impose them. At least not in Belarus, according to a story from RFE/RL.
Protests over a new tax aimed at reducing social welfare spread beyond the Belarusian capital, as thousands took to the streets in Homel and other towns. Along with similar protests two days earlier in Minsk, the February 19 demonstrations were some of the largest in the country in years. In Homel, near the border with Russia, at least 1,000 people marched and chanted slogans against the measure, known as the “Law Against Social Parasites.”
But what are “social parasites” and what does the law do?
…the law…requires people who were employed fewer than 183 days in a calendar year to pay a tax of about $200. …The measure is aimed at combating what President Alyaksandr Lukashenka has called “social parasitism.”
For what it’s worth, the Washington Post reports that the government had to back down.
The protesters won. On Thursday, Lukashenko announced that he won’t enforce the measure this year, though he’s not scrapping it. “We will not collect this money for 2016 from those who were meant to pay it,” he told the state news agency Belta. Those who have already paid will get a rebate if they get a job this year. The law, signed into effect in 2015, is reminiscent of Soviet-era crackdowns against the jobless, who undermined the state’s portrayal of a “workers’ paradise.”
That’s good news.

If people can somehow survive without working (assuming they’re not mooching off taxpayers, which is something that should be discouraged), more power to them. It’s not the life I would want, but it’s not the role of government to tax them if they don’t work. Or if they simply choose to work 182 days per year.

Mr. Lukashenko should concentrate instead on taking the heavy foot of government off the neck of his people. According to the most recent Index of Economic Freedom, Belarus ranks only #104, with especially weak scores for “rule of law” and “open markets.”

Given the low freedom ranking for Belarus, I suspect the real parasites in that country (just like in the U.S.) are the various interest groups that are feeding from the government trough.

If Mr. Lukashenko turned his country into a Slavic version of Hong Kong, with free markets and small government, people would be clamoring to work. But I’m not holding my breath expecting that to happen.

P.S. While government shouldn’t tax people for not working, it’s also a bad idea to subsidize them for not working. Indeed, there’s even a version of the Laffer Curve for poverty and redistribution.

P.P.S. On an amusing note, here’s the satirical British video on killing the poor instead of taxing them.

Republished from International Liberty.

Daniel J. Mitchell
Daniel J. Mitchell is a senior fellow at the Cato Institute who specializes in fiscal policy, particularly tax reform, international tax competition, and the economic burden of government spending. He also serves on the editorial board of the Cayman Financial Review.

This article was originally published on FEE.org. Read the original article.

Friday, March 17, 2017

D.C. Circuit Court Issues Dangerous Decision for Cybersecurity: Ethiopia is Free to Spy on Americans in Their Own Homes

The United States Court of Appeals for the District of Columbia Circuit today held that foreign governments are free to spy on, injure, or even kill Americans in their own homes–so long as they do so by remote control. The decision comes in a case called Kidane v. Ethiopia, which we filed in February 2014.

Our client, who goes by the pseudonym Mr. Kidane, is a U.S. citizen who was born in Ethiopia and has lived here for over 30 years. In 2012 through 2013, his family home computer was attacked by malware that captured and then sent his every keystroke and Skype call to a server controlled by the Ethiopian government, likely in response to his political activity in favor of democratic reforms in Ethiopia. In a stunningly dangerous decision today, the D.C. Circuit ruled that Mr. Kidane had no legal remedy against Ethiopia for this attack, despite the fact that he was wiretapped at home in Maryland. The court held that, because the Ethiopian government hatched its plan in Ethiopia and its agents launched the attack that occurred in Maryland from outside the U.S., a law called the Foreign Sovereign Immunities Act (FSIA) prevented U.S. courts from even hearing the case.

The decision is extremely dangerous for cybersecurity. Under it, you have no recourse under the law if a foreign government hacks into your car and drives it off the road, targets you for a drone strike, or even sends a virus to your pacemaker, as long as the government planned the attack on foreign soil. It flies in the face of the idea that Americans should always be safe in their homes, and that safety should continue even if they speak out against foreign government activity abroad.

Factual background

Mr. Kidane discovered traces of state-sponsored malware called FinSpy, a sophisticated spyware product which its maker claims is sold exclusively to governments and law enforcement, on his laptop at his home in suburban Maryland. A forensic examination of his computer showed that the Ethiopian government had been recording Mr. Kidane’s Skype calls, as well as monitoring his (and his family’s) web and email usage. The spyware was launched when Kidane opened an attachment in an email. The spying began at his home in Maryland.

The spyware then reported everything it captured back to a command and control server in Ethiopia, owned and controlled by the Ethiopian government. The infection was active from October 2012 through March 2013, and was stopped just days after researchers at the University of Toronto’s Citizen Lab released a report exposing Ethiopia's use of FinSpy. The report specifically referenced the very IP address of the Ethiopian government server responsible for the command and control of the spyware on Mr. Kidane’s laptop.

We strenuously disagree with the D.C. Circuit’s opinion in this case. Foreign governments should not be immune from suit for injuring Americans in their own homes, and Americans should be as safe from attacks carried out by remote control, malware, or robots as they are from human agents. The FSIA does not require the courts to close their doors to Americans who are attacked, and the court’s strained reading of the law is just wrong. Worse still, according to the court, so long as the foreign government formed even the smallest bit of its tortious intent abroad, it’s immune from suit. We are evaluating our options for challenging this ruling.

Source: D.C. Circuit Court Issues Dangerous Decision for Cybersecurity: Ethiopia is Free to Spy on Americans in Their Own Homes | Electronic Frontier Foundation

Andrew Jackson Is a Poor Presidential Role Model

Donald Trump added a portrait of Andrew Jackson to the White House Oval Office shortly after his inauguration. Why Jackson?

Well, Jackson’s defeat of incumbent John Quincy Adams in the 1828 election was the first great US political upset in which an anti-establishment candidate defeated an insider. This comparison no doubt pleases the man who kept Hillary Clinton from the White House.

Like Trump, Jackson also styled himself as a champion of the “common man,” and that’s a distinction that somehow follows him to this day. But does Jackson deserve to be remembered so fondly as the one who put power in the hands of the people? Let’s break down some of his greatest hits.

  • Egalitarian Reforms. The Jacksonian Era was typified by a reforming zeal, including movements for the abolition of slavery and the rights of women. While these movements might have used egalitarian Jacksonian rhetoric, they had little to do with the real Andrew Jackson, who both owned slaves and subscribed to an already outdated cult of masculinity preoccupied with, among other things, defending public female virtue. (The man loved a good duel.)
  • American Indian Removal. Jackson was the architect of the compulsory removal of Native Americans from their legal homes. This was a national plan for ethnic cleansing, coupled with the forcible redistribution of property from its rightful owners.
  • Checks and Balances. Jackson’s Indian removal policy also ignored the system of checks and balances inherent in the federal system, directly defying the Supreme Court’s ruling in the 1832 case Worcester v. Georgia. Jackson did, however, stop short of calling Chief Justice John Marshall a “so-called judge.”
  • The National Bank. The federal government had overreached its powers in creating the Second National Bank, and Jackson killed that institution. His dedication to defeating the bank, however, was driven by personal animosity more than sound intellectual foundations. (It wasn’t enough for Jackson’s enemies to lose; they had to be destroyed, and he used his power as president to wreak that destruction.) Jackson overstepped his own constitutional authority in his attack, and fighting one wrong with another is hardly great policy.
  • State Nullification. Similarly, Jackson’s action against state nullification, when South Carolina sought to invalidate the federal Tariffs of 1828 and 1832 on the grounds that they were unconstitutional, seems to have been less about principle than about his personal split with his former vice president, John C. Calhoun. Jackson’s stance against states’ rights aided the evolution of a powerful national government based on vigorous military suppression, a trademark of the Jacksonian presidency.
  • Spoils System. Jackson’s embrace of the spoils system, rewarding his supporters (or cronies) with political positions, further concentrated and entrenched his and later presidents’ executive power.
What is the takeaway here? Rather than decentralizing power or returning it to the people, Jackson magnified his own. As a matter of fact, he claimed that he embodied the people in the same way that Louis XIV believed that he was France, earning Jackson the title “King Andrew I” from his opponents.

In short, for those who support liberty, Trump has chosen a troubling model for his presidency.

This makes Thomas Jefferson’s words to Daniel Webster in 1824 all the more important to remember:
I feel much alarmed at the prospect of seeing General Jackson President. He is one of the most unfit men I know of for such a place. He has had very little respect for laws and constitutions, and is, in fact, an able military chief.  His passions are terrible. When I was President of the Senate, he was Senator; and he could never speak on account of the rashness of his feelings. I have seen him attempt it repeatedly, and as often choke with rage. His passions are, no doubt, cooler now; he has been much tried since I knew him, but he is a dangerous man.

Republished from Learn Liberty.

Amy Sturgis
Amy H. Sturgis earned her Ph.D. in intellectual history from Vanderbilt University, specializes in Science Fiction/Fantasy and Native American Studies, and teaches at Lenoir-Rhyne University.

This article was originally published on FEE.org. Read the original article.

Thursday, March 16, 2017

MegaCon 2002: Eugene Roddenberry Presents

From the program:

“Join the son of critically acclaimed Gene Roddenberry as he shows and discusses Star Trek Bloopers.”

There was lots of Star Trek discussion but I don’t believe there were any bloopers. Technical difficulties or something…

Rescue on Fractalus! (Atari XE) – Strategy

Wednesday, March 15, 2017

Trillions in Debt and We’re Just Scratching the Surface

As the federal debt has gone from astounding to unbelievable to incomprehensible, a new problem has emerged: The US government is actually running out of places to borrow.

How Many Zeros Are in a Trillion?

The $20 trillion debt is already twice the annual revenues collected by all the world’s governments combined. Counting unfunded liabilities, which include promised Social Security, Medicare, and government pension payments that Washington will not have the money to pay, the federal government actually owes somewhere between $100 trillion and $200 trillion. The numbers are so ridiculously large that even the uncertainty in the figures exceeds the annual economic output of the entire planet.

Since 2000, the federal debt has grown at an average annual rate of 8.2%, doubling from $10 trillion to $20 trillion in the past eight years alone. Who loaned the government this money? Four groups: foreigners, Americans, the Federal Reserve, and government trust funds. But over the past decade, three of these groups have cut back significantly on their lending.
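Doubling in a fixed number of years implies a particular compound annual growth rate; a quick sketch of that arithmetic, using the article's round figures:

```python
# Compound-growth arithmetic behind "doubling in eight years":
# if debt grows from D to 2*D in n years at a constant rate r,
# then (1 + r)**n = 2, so r = 2**(1/n) - 1.
n_years = 8
implied_rate = 2 ** (1 / n_years) - 1
print(f"Doubling in {n_years} years implies about {implied_rate:.1%} per year")
# 2**(1/8) - 1 is roughly 0.0905, i.e. about 9% annually -- in the same
# ballpark as the 8.2% average growth rate the article cites since 2000.

# The rule of 72 gives the same ballpark from the other direction:
print(f"Rule of 72: about {72 / 8.2:.1f} years to double at 8.2% per year")
```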

Foreign investors have slowed the growth in their lending from over 20% per year in the early 2000s to less than 3% per year today. Excluding the Great Recession years, American investors have been cutting back on how much they lend the federal government by an average of 2% each year.

Social Security, though, presents an even bigger problem. The federal government borrowed all the Social Security surpluses of the past 80 years. But starting this year, and continuing either forever or until Congress overhauls the program (which may be the same thing), Social Security will only generate deficits. Not only is the government no longer able to borrow from Social Security, it will have to start paying back what it owes–assuming the government plans on making good on its obligations.

With federal borrowing growing at more than 6% per year, with foreign and American investors becoming more reluctant to lend, and with the Social Security trust fund drying up, the Fed is the only game left in town. Since 2001, the Fed has increased its lending to the federal government by over 11% each year, on average. Expect that trend to continue.

Inflation to Make You Cry

For decades, often in word but always in deed, politicians have told voters that government debt didn’t matter. We, and many economists, disagree. Yet even if the politicians were right, the absence of available creditors would be an insurmountable problem—were it not for the Federal Reserve. But when the Federal Reserve acts as the lender of last resort, unpleasant realities follow. Because, as everyone should be keenly aware, the Fed simply prints the money it loans.

A Fed loan devalues every dollar already in circulation, from those in people’s savings accounts to those in their pockets. The result is inflation, which is, in essence, a tax on frugal savers to fund a spendthrift government.

Since the end of World War II, inflation in the US has averaged less than 4% per year. When the Fed starts printing money in earnest because the government can’t obtain loans elsewhere, inflation will rise dramatically. How far is difficult to say, but we have some recent examples of countries that tried to finance runaway government spending by printing money.

From 1975 to 1990, the Greek people suffered 15% annual inflation as their government printed money to finance stimulus spending. Following the breakup of the Soviet Union in the 1990s, Russia printed money to keep its government running. The result was five years over which inflation averaged 750%. Today, Venezuela’s government prints money to pay its bills, causing 200% inflation which the International Monetary Fund expects to skyrocket to 1,600% this year.
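To see what those rates mean once compounded, here is a small Python sketch using the article's figures; it assumes a constant annual rate, which real inflation episodes never quite follow, so these are order-of-magnitude illustrations only:

```python
def price_multiple(rate, years):
    """How many times over prices rise at a constant annual inflation rate."""
    return (1 + rate) ** years

print(price_multiple(0.15, 15))  # Greece, 15% for 15 years: prices rise ~8x
print(price_multiple(7.50, 5))   # Russia, 750% for 5 years: ~44,000x
print(price_multiple(16.0, 1))   # Venezuela at 1,600%: 17x in a single year
```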

For nearly a century, politicians have treated deficit spending as a magic wand. In a recession? We need jobs, so government must spend more money! In an expansion? There’s more tax revenue, so government can spend more money! Always and everywhere, politicians argued only about how much to increase spending, never whether to increase spending. A century of this has left us with a debt so large that it dwarfs the annual economic output of the planet. And now we are coming to the point at which there will be no one left from whom to borrow. When creditors finally disappear completely, all that will remain is a reckoning.

This article first appeared in InsideSources.

Antony Davies
Antony Davies is an associate professor of economics at Duquesne University in Pittsburgh.

He is a member of the FEE Faculty Network.

Tuesday, March 14, 2017

Queen Victoria’s Modernity

“I decide what is the future,” 21-year-old Queen Victoria angrily informs her husband, Prince Albert, in the sixth episode of Masterpiece Theater’s Victoria. It sounds like a pretentious, condescending, and just plain wrong thing to say, but she had no idea how right she was.

We don’t usually associate her reign with progressive trendsetting. Today we use the word “Victorian” to identify overly fussy wallpaper or overly scrupulous etiquette and morals that cry out for relaxation. But think of when she ruled: she reigned from 1837 to 1901, a period of astonishing economic, technical, industrial, and cultural transformation for the entire Western world. Her influence became a guiding voice for progress, not reaction.

So perhaps it’s time to revisit the person who has an entire era named after her. This show is just the thing to help.

The first season of Victoria only covers the first two years of her reign, but it’s enough to establish that Queen Victoria arguably did more for the United Kingdom and the British monarchy than her three or four immediate predecessors or her successors. She certainly did more for women’s suffrage than even Queen Elizabeth II.

So many traditions we now accept as doctrine were actually begun by Victoria a mere 150 years ago – a fraction of the time the British monarchy has existed, and far more recently than I, at least, assumed would be the case.

Here are some traditions we can credit to Queen Victoria, who no doubt had no idea just how extensive her influence over the future was, as portrayed in Victoria.

Buckingham House

We associate the royal household with Buckingham Palace now, but when Victoria became queen it was simply “Buckingham House,” and the royal family had its official residence at Windsor Castle. The original structure of Buckingham had been built by 1703 for the Duke of Buckingham and bought (probably just taken) by George III in 1761 for his wife, Queen Charlotte. Over the following decades, it was built up until it became the huge building Queen Victoria decided should be renamed as a palace, rather than a house. She took up residence there so she and her court could spread out – she wanted to be further from her manipulative mother and those who influenced her – and to get away from the noise of the city of London.

If you’ve watched The Crown, you’ll remember when Queen Elizabeth and Prince Philip were hoping to stay at their home in Clarence House, which is still used as the Heir Apparent’s place of residence. Elizabeth was strongly opposed by her secretary, who informed her that it was “tradition” to live at Buckingham Palace. Perhaps if Elizabeth had known that this tradition was only three generations old and that nearly every previous monarch had changed their place of residence, she could have won that argument.

Let There Be Light?

When Victoria took the throne, gas lighting was just starting to become a mainstream thing. Her Head of Household and former governess, Louise Lehzen, decided she wanted to bring “economy” and “modernity” to the royal household, and chose to begin with the lighting. Buckingham Palace was, of course, lit entirely by candlelight. To give some idea of what that would’ve been like: filming Victoria required 12,000 candles. They only filmed eight episodes, and they were only lighting a few rooms at a time, depending on what scenes they were filming. So it’s easy to see why Lehzen wanted to replace the candles with the latest gas technology.

It was a great idea, but the technology was still too new to catch on. Installing it meant taking apart the walls, revealing an unknown rat infestation that was as horrifying to watch as you’re imagining. As Victoria’s steward Penge said, “This is what happens when you interfere with nature.” Obviously the answer is to get rid of all the rats, instead of carrying on as before and pretending it’s all fine, but for some reason, his argument convinced a lot of people. Apparently, it simply wasn’t natural to have gas light.

However, the gas lines were installed in a few rooms in an attempt to make it work. But employees in the palace didn’t know how to operate it, and they burned their hands trying to light the gas. In the end, Lehzen decided that in this case, at least, the old way was the good way, and Penge was left to smugly order new candles.

Married in White

Nowadays it’s highly unusual in Western culture for a wedding dress not to be white. It’s said to be highly traditional, highly symbolic, and special because it’s difficult to keep clean (or was, before washing machines and drop-off dry cleaning became widely available). But it’s really just a fad that began with Queen Victoria and never stopped.

Before Victoria’s wedding, brides simply wore the nicest dress they had. Fabric was very expensive before the Industrial Revolution, and most women couldn’t afford a new dress for their wedding day, let alone a fancy one. In fact, white had historically been reserved for the opposite occasion – mourning.

Royal wedding clothes were the robes of State, heavy and cumbersome. Victoria didn’t want that and decided she wanted to get married in a white dress, because she wanted to. Her decision sparked a trend, and 150 years later, we’re still carrying it on.

The trend became even more pronounced after the 1960s. Before then, wedding dresses were white but cut in whatever dress style was popular at the time. After the 1960s, wedding dresses became longer, fuller, and fancier. In a word, they became more Victorian.

The Transportation of the Future

Throughout the sixth episode, Victoria and Albert fought about whether or not locomotives should be allowed in England. In the scene where Victoria declared her ownership of the future, Albert had just returned from a clandestine visit to her political enemy Robert Peel to see the “locomotive” Peel had running on his property. Victoria had been against locomotives from the beginning, citing the inevitable disruption to the land the railways would have to cross. She was furious that he would risk his life riding the machine (which would both leave her alone and possibly leave the throne open to her unpopular uncle if she died in childbirth) and attempt to support something she didn’t want.


However, after taking a couple of days to think it all over, Victoria decided it was only fair to see and test the thing for herself. So she went off, very, very pregnant, to Peel’s. Locomotives were still very dangerous, and childbirth was still very high-risk, so Peel, his train engineer, and Victoria’s ladies-in-waiting were all extremely concerned for Her Majesty and the heir she was carrying. But she waved them all aside, even though she was clearly very scared herself, climbed on board, and instructed the engineer to start the train.

At first she was terrified to be moving so rapidly, but Victoria’s fear quickly turned to exhilaration as she waved at everyone she passed and marveled at the machine’s ability. Albert, meanwhile, heard his wife had gone off to try the locomotive, and ended up running beside her train carriage for a bit (locomotives weren’t nearly as fast then as they are now, and Peel’s small, private train would’ve been even slower). He called up to her, asking what she thought, and she yelled back, “This is the future!” Needless to say, locomotives were brought to England and used throughout the country.

The Modern Kitchen

While Victoria and Albert were deciding the future of transportation in Great Britain, there was an arguably more important side story being carried out in the kitchens of Buckingham Palace: the invention of the “bombe Victoria,” a kind of Baked Alaska. Victoria’s Chief Chef and Maître d’Hôtel, Charles Elmé Francatelli, was an artist of a chef, and he knew it. One of the first celebrity chefs, he wrote several cookbooks filled with his own inventions, was known for his elaborate sugar work and confections, and was generally wasted on the English queen and her German husband, who preferred plain food to his couture French cuisine.

Francatelli ended up leaving the royal household after only two years and went on to work for private aristocratic clubs, marketing himself as the Maître d’Hôtel to Her Majesty the Queen. He continued to act as the creative genius behind the food trends of the day, transforming food from mere sustenance to an experience, making high-quality foods accessible, and bringing flavorful recipes previously reserved for the rich to the middle classes via his cookbooks.

Royal Births

There was a practice in England (and other countries) to have a crowd witnessing royal births: midwives, ladies-in-waiting, and members of court – including the men. The idea behind it was to make sure nothing untoward was done to the mother or the baby, and to ensure that if the baby was born stillborn, it wouldn’t secretly be replaced with another, live baby to falsely continue the royal line. Victoria didn’t know about this practice until she herself was in labor and looked up to see a couple dozen men standing around talking and joking and watching her.

“What are all those men doing there?” she asked Albert. “Apparently it is the custom, in case there is a substitution,” he responded somewhat apologetically. Victoria was having none of it, of course, and immediately said, “Tell them all to go away!”

Victoria was undoubtedly not the first English royal woman to wish the crowd would leave, but as the first Queen of England ever to give birth, Victoria finally had the authority to do away with the custom. She did allow the Home Secretary to stay, and this became the new custom until Elizabeth II was due to give birth to Prince Charles. After his birth at Buckingham Palace, royal babies were born at St. Mary’s Hospital in Paddington. Much improved over the old way.

Just Beginning

There are many other examples of Victoria determining the future as well: postage stamps were first introduced in England with her approval, and she made her country realize that a Queen could be monarch, wife, and mother all at once. As her rule extended into the late 19th century and even into the 20th, she started dozens more trends and introduced hundreds of things. But since Season Two of Victoria isn’t out yet, those will have to wait.

Eileen L. Wittig
Eileen Wittig is an Associate Editor and author of the Lazy Millennial column at FEE. You can follow the Lazy Millennial on Twitter.

Why We’re Being Watched

Wikileaks has just published over 8,000 files they say were leaked from the CIA, explaining how the CIA developed the capacity to spy on you through your phone, your computer, and even your television. And Wikileaks’s Julian Assange claims these “Vault 7” documents are just one percent of all the CIA documents they have.

The media will be combing through these for weeks or months, so now is a perfect moment for us to reconsider the role of privacy, transparency, and limited government in a free society.

We’ve put together a quick list of the six best Learn Liberty resources on government spying and whistleblowing to help inform this discussion.

1. War Is Why We’re Being Watched

Why is the US government spying on its citizens in the first place? Professor Abby Hall Blanco says that expansive state snooping at home is actually the result of America’s military interventionism abroad:

2. Is Privacy the Price of Security?

Yes, you may think, the government is snooping on us, but it’s doing that to keep us safe!

That’s the most common justification for sweeping and intrusive surveillance, so we held a debate between two experts to get right to the heart of it. Moderated by TK Coleman, this debate between Professor Ronald Sievert and Cindy Cohn, the Executive Director of the Electronic Frontier Foundation, was inspired in part by the revelations about NSA surveillance leaked by Edward Snowden in June 2013.

3. Freedom Requires Whistleblowers

People are already drawing parallels between the Snowden leaks and the Vault 7 revelations. If the leaks are indeed coming from a Snowden-like whistleblower, that will once again raise the issue of government prosecution of people who reveal classified information to the public.

Professor James Otteson argues that a free society requires a transparent government, and whistleblowers play a key role in creating that accountability. Otteson also sounds a warning that should resonate with many Americans today:
Maybe you’re not concerned about the invasions of privacy that the federal government agencies are engaging in because you think, “Well, I haven’t done anything wrong. What do I have to fear?” Maybe you think, “I like and support this president. I voted for him.”

But what about the next president? The powers that we let the government have under one president are the same powers that the next president will have too.

What if the next president is one you don’t support? He, too, will have all the power that you were willing to give the president you now support.

4. Encryption Is a Human Rights Issue

Documents from Vault 7 suggest that the CIA has been so stymied by encrypted-messaging apps, such as Signal and WhatsApp, that it has resorted to taking over entire smartphones to read messages before they are sent.

That turns out to be a costly, targeted, and time-consuming business that doesn’t allow for mass data collection. But for decades, government officials have tried to require tech companies to give the government a backdoor into their encryption. In “Encryption Is a Human Rights Issue,” Amul Kalia argues that protecting encryption from government is essential to our safety and freedom.

5. The Police Know Where You Live

It turns out that it’s not just spy agencies that have access to detailed information about your life. Ordinary police officers have it, too, and they often face little supervision or accountability. As Cassie Whalen explains, “Across the United States, police officers abuse their access to confidential databases to look up information on neighbors, love interests, politicians, and others who had no connection to a criminal investigation.”

Surveillance is a serious issue at every level of government.

6. Understanding NSA Surveillance

If you’re ready to take your learning to the next level, check out our complete video course on mass government surveillance with Professor Elizabeth Foley. In it, you’ll learn what you need to know to make sense of the NSA scandal in particular and mass surveillance in general.

Reprinted from Learn Liberty.

Kelly Wright
Kelly Wright is an Online Programs Coordinator at the Institute for Humane Studies.

Friday, March 10, 2017

Color Computer Magazine (November 1983)

Thursday, March 9, 2017

Trump: Privacy for Me, Not for Thee

President Donald Trump has a consistency problem on the issues of government surveillance and privacy. For the most part, Trump seems to make his ideological decisions based on how something impacts his life personally.

This has been made overwhelmingly apparent watching Trump spend the entire weekend condemning former President Obama for allegedly wiretapping the phones at Trump hotels during the 2016 presidential campaign.

While addressing the press and calling for a full congressional inquiry, President Trump referred to these actions as Obama’s “greatest abuse of power.”

To be sure, if Trump’s accusations about Barack Obama prove to be true, this is absolutely appalling behavior on the part of the former president. However, is this the greatest abuse of power by the Obama Administration? Hardly.

Hope and Change

The Obama presidency was supposed to usher in a new era of government transparency. The Bush years had left the people traumatized as civil liberties and other constitutional safeguards were disregarded in the name of national security and the war on terror.  

The people wanted change and President Obama was going to be the man who led this country back to freedom.

Unfortunately, that was not the case.

It wasn’t long before journalists had dubbed the Obama Administration one of the least transparent presidencies in modern times. To make matters worse, Obama continued to execute Bush’s destructive foreign policy strategy—he just did so in secret and kept it from the American public—or so he thought.

Thanks to journalist Jeremy Scahill, the world discovered that President Obama had not only continued but actually escalated the use of drone warfare in the Middle East. In Scahill’s Oscar-nominated film, Dirty Wars, it was also revealed that America’s favorite peaceful president had a secret kill list, which he used to go after enemy combatants, including American citizens.

The Edward Snowden NSA leaks came as a further blow to then President Obama, who was elected on a platform that promised to protect government whistleblowers. Of course, this section was conveniently removed from his website shortly after Snowden’s first round of leaks.

Given this information, you would think Trump would admire Snowden. After all, Snowden, like Trump, discovered that the government was illegally spying on its citizens, and spoke up. However, that doesn’t seem to be the case.

“Kill the Traitor”

Uttering the phrase “kill the traitor” in 2013, Donald Trump called for the execution of Edward Snowden. During the campaign, Trump claimed that Snowden was working as a spy for Russia.

While similar rhetoric was often thrown around by other Republican presidential candidates, Trump’s continued disapproval of Snowden was particularly perplexing, since it was only thanks to another infamous government whistleblower that Trump was thrown a lifeline when he needed it the most.

Whether he will admit it or not, the leaked Access Hollywood footage dealt a huge blow to the Trump campaign. His offensive comments about women were putty in the hands of Hillary Clinton, who was desperately looking for any way to take him down.

Just when it appeared that the Trump campaign would never recover from his past remarks, Wikileaks changed the game by leaking several of John Podesta’s personal emails, which revealed wrongdoing on the part of the Democratic National Committee.


Trump seized on this moment and shifted the focus back to “crooked Hillary” and the Democratic Party and away from his own Access Hollywood scandal. Publicly declaring his love for Wikileaks, Trump went on to praise Assange for the good work he was doing while simultaneously condemning Snowden for the same actions.

It should also not be forgotten that during the 2016 legal battle between Apple and the Department of Justice, Trump called for a national boycott of Apple until the company agreed to unlock the phone of San Bernardino shooter, Syed Farook.

Addressing a crowd Trump said:
“Apple ought to give the security for that phone, OK. What I think you ought to do is boycott Apple until such a time as they give that security number. How do you like that? I just thought of it. Boycott Apple.”
With these comments, Trump sided with the surveillance state – one that threatens and oppresses not only Americans but every digital user on the planet – rather than with commerce and consumers, who are in desperate need of privacy protection. Apple sought to give that to its customers, while Trump wanted to take it away.

Trump is absolutely right, the allegations that President Obama wiretapped his phones are indeed “very troubling.” However, they are no less troubling than the surveillance state that Trump has also advocated for both in his support of the Department of Justice’s fight against Apple, and his condemnation of Edward Snowden.

Now that Trump thinks he has been a victim of the same spying he has favored using on everyone else, he flies into a fury of outrage. He is right now, but he was wrong before.

Brittany Hunter
Brittany Hunter is an associate editor at FEE. Brittany studied political science at Utah Valley University with a minor in Constitutional studies.

The Luddites Were Wrong Then and They’re Wrong Now

When Joseph Whitworth was growing up in Stockport, the man who became the greatest mechanical engineer of the Victorian age witnessed a traumatic sight. In 1812, this unlovely industrial town on the outskirts of Manchester was overrun by Luddite rioters, all the more terrifying as they were wearing women’s clothes as they went on the rampage, smashing power looms and burning down textiles mills. Many of the Luddites were later hanged, their protest against new technology in vain.

Today, the impact of new technology on jobs and social order is as burning a political and economic question as it was in the 19th century. In recent years, millions of Americans and others around the world have lost their jobs in manufacturing, their disaffection helping to propel Donald Trump to the White House.

There seem to be increasingly few jobs that a computer cannot do better than a mere human being, from flipping burgers to driving a lorry or processing an insurance claim. Whitworth, who lived from 1803 to 1887, was at the heart of a similar Victorian debate.

Less celebrated than Brunel or Stephenson, Whitworth arguably had a more important impact than either of these better-known figures. Together with other mechanical engineers such as Henry Maudslay, Richard Roberts, and James Nasmyth, Whitworth pioneered a manufacturing revolution that saw Great Britain transformed from a craft economy to full mechanization in the space of two generations. Without this, the railways could not have come into being, the textiles industry would not have become so dominant, and shipbuilding would not have evolved into a great industry.

When Whitworth started out, the main tools used in the primitive factories of London, Birmingham, and Manchester were hammer and chisel, wielded by hand. The UK’s craftsmen were highly skilled, but standards of accuracy were poor. Mass production was at a rudimentary stage. By the time of the Great Exhibition in 1851, when Whitworth carried off more prizes for engineering excellence than anyone else, all had changed: the UK was the undisputed workshop of the world.

The mid-century was the age of machinery: machines operated to unprecedented levels of precision. Whitworth designed one that could measure down to a millionth of an inch, admired at the Great Exhibition by Prince Albert and Charles Dickens. He pioneered machine tools: the lathes, boring, planing, milling, drilling, and slotting machines and so forth that replicated tasks traditionally carried out by hand. These were sold by the ton from his factory in the heart of Manchester, near Piccadilly Station.

His great rival and fellow Manchester industrialist, James Nasmyth, also won a prize for his steam hammer, an archetypal machine tool that was a source of wonderment to contemporaries as it combined power with delicacy: it could bash a giant red-hot girder into shape as well as be brought to rest on top of a wine glass. This technology was adapted to make the pile driver, a machine that transformed Victorian civil engineering and is still in use on building sites today.

Mass production became a reality, as did interchangeable components – the notion that you could make parts for an engine or a ship or a railway carriage in different factories and they would all fit together. Remarkably, it wasn’t until Whitworth completed the job in the 1860s that there were standard measures for nuts and bolts, the most basic manufacturing components. The Whitworth Standard was in place in much of the world until after the Second World War.

Contemporaries viewed mechanization, the 19th-century equivalent of automation, with a mixture of horror and awe. We are familiar with the lurid descriptions of the industrial north in the novels of Charles Dickens or Elizabeth Gaskell, but less so with the wonderment people felt when they saw the machinery at work.

In the mid-1820s, the Manchester engineer Richard Roberts invented the self-acting mule, a machine that more or less completely automated the fiendishly complicated task of spinning yarn. “I have stood for hours admiring the precision with which the self-actor executes its multifarious successions and reversals of movement,” gushed one observer. The machine was dubbed the Iron Man because it seemed to move and think as if it were a human being.

The Iron Man was invented at the request of Manchester mill-owners who were fed up with the power of their workers to hold production to ransom and demand ever higher wages. As today, one of the motivations for the new technology was to cut costs and eliminate the need for troublesome human labor.

James Nasmyth retired from business in the 1850s after a bruising strike, complaining that workers were feckless and failed to turn up for work, while machines “never got drunk, their hands never shook from excess, they were never absent from work, they did not strike for wages [and] they were unfailing in their accuracy and regularity”. With the help of machinery, he reduced the workforce at his Patricroft factory near Manchester by half.

But, echoing today’s debates, there was another perspective. Whitworth celebrated the fact that mass production brought prices down dramatically: the cost of making a surface of cast iron true with hammer, chisel, and file was 12s per square foot, compared to labor costs of less than one penny if a planing machine were used. Likewise, the price of a 29-yard bolt of printed cloth fell from 30s 6d to 3s 9d.
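In pre-decimal British currency a shilling was 12 pence, so those figures work out to at least a 144-fold saving on machining and roughly an 88 percent drop in the cloth price. A quick Python sketch of the conversion (an illustration added here, not from the original article):

```python
def to_pence(shillings, pence=0):
    """Convert pre-decimal shillings and pence to pence (1 shilling = 12 pence)."""
    return shillings * 12 + pence

hand_finish = to_pence(12)             # 12s per square foot by hammer, chisel, and file
machine_finish = 1                     # under a penny with a planing machine
print(hand_finish / machine_finish)    # at least a 144-fold saving

cloth_before = to_pence(30, 6)         # 30s 6d per 29-yard bolt of printed cloth
cloth_after = to_pence(3, 9)           # 3s 9d
print(1 - cloth_after / cloth_before)  # about an 88% price drop
```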

This spectacular reduction in costs brought benefits to society at large, he contended. Staple goods became cheaper, and there would be more leisure time for workers and less need for strenuous manual labor. The technology created new and better jobs for working people, and wages could go up.

Even Nasmyth agreed. “Brute force is set aside, and the eye and the intellect of the workman are called into play,” he said. “All that the mechanic has to do now, and which any boy or lad of 14 or 15 is quite able to do, is to sharpen his tool, place it in the machine in connexion with the work, and set on the self-acting motion, and then nine-tenths of his time is spent in mere superintendence, not in labouring, but in watching the delicate and beautiful operations of the machine.”

By the middle of the 19th century, British engineering had become capital- rather than labor-intensive. Businesses had become larger, more dependent on expensive equipment, and less on an aristocracy of skilled labor. The roots of the UK’s notoriously poor industrial relations were established.

Whitworth himself was sent by the government to examine American manufacturing practices after the Great Exhibition. He found American workers embraced innovation, while the British resisted change and shared some of the destructive tendencies of their Luddite forebears.

A strange and obsessive man, with the looks of a baboon (according to Jane Carlyle), Whitworth had strong humanitarian concerns for his own workforce, installing public baths near his factory and, while he was alive, giving away a stupendous £100,000 to fund 30 technical scholarships.

In 1874, he converted his business into a limited liability company and became a pioneer of worker democracy, sharing control with 23 senior staff, and encouraging ordinary operatives to invest £25 in shares. When he died childless in 1887, he left £600,000 to fund his favorite causes. This is the equivalent of Bill Gates-style munificence in our own age, and his philanthropy benefits the students of Manchester University to this day, as well as the park and the magnificent gallery that bears his name.

Thomas Carlyle, Charles Dickens, and Karl Marx saw the mechanization pioneered by Whitworth and his peers as dehumanizing and spiritually impoverishing. But for all the poverty and squalor associated with rapid industrialization, the expanding population enjoyed enduring improvements in living standards, and the economy began to grow at an unprecedented rate. In the long run, writes the economist Robert C. Allen, the economic growth that got going in the mid-1800s “compounded to the mass prosperity of today.”

The lesson for today is that technological innovation can be extremely painful, but that over the longer term it does not necessarily come at the price of jobs or prosperity: indeed, new technology begets further innovation that creates wealth and employment in entirely unforeseeable ways. This was not appreciated by the frock-wearing Luddites of the early 19th century, nor is it understood by their spiritual heirs two centuries later.

Republished from CapX.

David Waller
David Waller is an author, business consultant and former Financial Times journalist specialising in business and the nineteenth century. He is the author of "Iron Men: How One London Factory Powered the Industrial Revolution and Shaped the Modern World" (Anthem Press).


Wednesday, March 8, 2017

Secret Court Orders Aren’t Blank Checks for General Electronic Searches

Imagine this: the government, for reasons you don't know, thinks you're a spy. You go on vacation and, while you're away, government agents secretly enter your home, search it, make copies of all your electronic devices, and leave. Those agents then turn those devices upside down, looking through decades' worth of your files, photos, and online activity saved on your devices. They don't find any evidence that you're a spy, but they find something else—evidence of another, totally unrelated crime. You're arrested, charged, and ultimately convicted, yet you're never allowed to see what prompted the agents to think you were a spy in the first place.

Sounds like something from dystopian fiction, right? Yet it's exactly what happened to Keith Gartenlaub. In January 2014, the FBI secretly entered Gartenlaub's home while he and his wife were on vacation in China. Agents scoured the home, taking pictures, searching through boxes and books, and—critically—making wholesale copies of his hard drives.

Agents were authorized by the secret Foreign Intelligence Surveillance Court ("FISC") to search for evidence that Gartenlaub was spying for the Chinese government. There’s only one problem with that theory: the government has never publicly produced any evidence to support it. Nevertheless, Gartenlaub now sits in jail—not for spying, but because the FBI’s forensic search of his hard drives turned up roughly 100 files containing child pornography, buried among thousands of other files, saved on an external hard drive.

Gartenlaub was tried and convicted, and he appealed his conviction to the Ninth Circuit Court of Appeals. EFF (along with our friends at the ACLU) recently filed an amicus brief in support of his appeal.

There are plenty of troubling aspects to Gartenlaub’s prosecution and conviction. For one, and unlike normal criminal prosecutions, neither Gartenlaub nor his lawyers have ever seen the affidavit and order issued by the FISC that authorized the search of his home. There are also legitimate concerns about the sufficiency of the evidence used to convict him.

But we got involved for a different reason: to weigh in on the Fourth Amendment implications of the FBI’s searches of Gartenlaub’s electronic devices. The unusual facts of this case gave us an unusually good opportunity to push for greater Fourth Amendment protections in all searches of electronic devices.

Here’s why: when agents copied and searched Gartenlaub’s devices, they were only authorized to search for national security-related information. But the prosecution that resulted from those searches and seizures had nothing to do with national security at all. So, either the FBI seized information that was outside of the warrant (which the Fourth Amendment prohibits); or it was relying on an exception to the warrant requirement, like “plain view”—an exception that allows law enforcement to seize immediately obvious contraband when the government is in a place to lawfully observe it.

Plain view makes sense in the physical world. If cops are executing a search warrant for a home to search for drugs, they shouldn’t have to ignore the dead body lying in the living room. But the way plain view works in the digital context—especially forensic computer searches—is not at all clear. How far can cops rummage around our computers for the evidence they’re authorized to look for? Does a warrant to search for evidence of drug dealing allow cops to open all the photos stored on our computer? Does an order authorizing a search for national security information let the government rifle through a digital porn collection? And where do we draw the line between a specific search, based on probable cause for specific information stored on a computer—which the Fourth Amendment allows—and a general search for evidence of criminal activity—which the Fourth Amendment prohibits?

Our electronic devices contain decades' worth of personal information about us. And, in many ways, searches of our electronic devices can be more intrusive than searches of our homes: there is information stored on our phones, computers, and hard drives—about our interests, our political thoughts, our sexual orientations, our religious beliefs—that might never have been previously stored in our homes, or, for that matter, anywhere at all. Because of the sensitivity of this data, we need clear restrictions on law enforcement searches of our electronic devices, so that every search doesn't turn into the type of general rummaging the Fourth Amendment was designed to prevent.

In our brief, we argued this case gave the Court a perfect opportunity to set a clear rule. We argued that the FBI’s search of Gartenlaub’s hard drives for evidence of regular, domestic crimes violated the Fourth Amendment, and we urged the Court to adopt a rule that would prohibit the FBI from using evidence that it obtained that was outside the scope of the initial search authorization. This would be a promising first step in limiting law enforcement’s electronic search powers and in protecting our right to privacy in the digital age.

Source: Secret Court Orders Aren't Blank Checks for General Electronic Searches | Electronic Frontier Foundation

Why America Needs “Star Wars”

My childhood, like those of so many others, was a combination of George Lucas, Steven Spielberg, and, of course, John Williams. I would often sit, perhaps unhealthily, for hours at a time in front of a TV watching VHS tapes of dinosaurs eating people, a professor stealing holy artifacts, and lightsabers crashing. I absorbed it all, and my brothers and I practiced it. We would duel with our plastic lightsabers (the old, sturdy ones), hurting each other's fingers and feelings. Around high school, I had something of a “nerd retreat,” a time when being a fan of Star Wars was for some reason “uncool.” College reopened my love for the great saga, and now I unashamedly utilize the galaxy far, far away in my classroom teaching economics and government.

Every fandom has its gloriously diverse and vast fan-fiction, with theories that range from the plausible to the conspiratorial. Star Wars, through the former Expanded Universe (dubbed “Legends”) and the official Canon, is ripe with opportunity for fans to write, speculate, and imagine. My three younger brothers and I constantly engage in this activity, debating the merits of Emperor Palpatine as the murderer of Padme Amidala and the like. But while all of this is fun and engaging, it seems to lack a certain gravity of importance. I asked this question a few weeks ago: if it lacks importance, why do so many love to think and talk about Star Wars? What brings millions to engage in such an activity?

The Hero’s Journey through Space

In order to answer this question, we need to understand the definition of a key word: mythos. A mythos is a common set of stories that can be used to explain the world and, more often, to provide a foundation for a cultural morality. A mythos is not the equivalent of religion. Religion tends to provide an explicit and prescriptive morality. A mythos provides more of a cornerstone worldview, a basic layer for others to build upon with morality.

Any discussion of mythos has to include the renowned mythologist Joseph Campbell, whose seminal work The Hero with a Thousand Faces elaborated on what’s called the “hero’s journey.” Essentially, the hero’s journey is a basic story structure where a seemingly boring individual rises to become a hero with the aid of mentors and friends, and must brave great adventures and villains. Campbell identified the hero’s journey across all cultures. The same basic myth-narrative is repeated in nearly all geographies and ethnicities. The hero’s journey is, at heart, the common human story repeated everywhere.

This monomyth, as it is called, has been studied and examined ever since Campbell’s work was published in 1949. It’s changed here and there, with different scholars adding different things, but it remains more or less the same. The monomyth can be clearly seen through the Star Wars saga, but it is especially clear in Episode IV: A New Hope.

Luke, a seemingly unbecoming farm boy, is called to the adventure of saving a captured princess from a black-cloaked villain, and initially refuses. With the help of a wizardly mentor and a band of equally unbecoming allies, he becomes entrapped in the belly of the beast (the Death Star), from which they escape with their reward (the Death Star plans), but not before Luke endures the pain of watching his mentor (old Ben Kenobi) die. The final trial, destroying the beast, is Luke’s great transformation from the boy on Tatooine to the next generation of warriors (the Jedi Knights).

This is fascinating stuff, and very exciting. Not only was the 1977 cinematic experience great, but the entire saga’s story is wonderful, however much it may be masked by poor dialogue and acting. Film critics have never been fans of the movies, even for their stories. They say it’s too easy, made for children, cartoonish, etc. I read that as, “This isn’t morally ambiguous, and therefore, it’s not a good story.” Such nihilism is apparently cool, but I don’t buy it. No, the stories aren’t all that complex, but that’s the point. Remember, a mythos is supposed to aid us in developing an understanding of the world from a certain point of view.

Through stories, we lay a foundation to build an ethical code founded in morality. The purpose is to get us to think about how we act and why we act. If we get bogged down in Inception-class complexity, we lose that powerful purpose.

The story in Star Wars is intentionally simple. The characters aren’t stereotypical, but archetypical, and resonate with a certain set of traits we can easily identify. It can sometimes feel like a children’s story, but again, that’s the point. The essence of a mythos should make us wonder in awe, tap into our imagination, bring out our inner child. When I watch these movies, I’m like a giddy boy, relishing the narrative. When the movie is done, it’s almost as if my inner child, having finished the adventure, returns and consults with my adult self on what just happened. That’s mythos: the dialogue between wonder and reason.

The Great American Mythos

The master of this myth-creating process was J.R.R. Tolkien, author of The Lord of the Rings and The Hobbit. He saw that England lacked a true mythos, one separated from reality (thus eliminating anything relating to King Arthur and Beowulf). He set out to create an entire universe he could populate with stories; thus was born Middle-earth. His goal wasn’t necessarily to write great stories (which he did), but to provide an epic universe with histories, a genesis (see: The Silmarillion), and languages with dialects. LOTR has a distinctly English feeling to it, one his countrymen could understand. He wrote it so others could think about the world in a certain way. Some of these stories, like The Hobbit, are so relatable they feel like a children’s story.

Now, George Lucas is no Tolkien. He is a controversial creator, one who nearly destroyed his saga. His storylines can be disjointed, almost contradictory, and he can seem self-serving. Nevertheless, his universe is a fountain of myth, and one that resonates with millions of Americans. Why?

America, at its core, is part of Western Civilization, and relies heavily upon its philosophy and religions. As such, it is greatly influenced by the moral idea that there exists objective good and evil. But we are also a multicultural nation, one that has accepted and welcomed many Eastern philosophies as well. As Americans, we are intrigued by the initially exotic beliefs of the East, and we find a certain tranquility in them. We are drawn toward the idea of a Buddhist monk devoid of personal possession and at peace with everything. We see his balance, and we desire it.

Star Wars seeks to establish a clear dynamic between good and evil: the Rebel Alliance and the Galactic Empire, the Jedi and Sith, Luke Skywalker and Darth Vader. There is great conflict between these easily identified sides (I mean, for crying out loud, the lightsaber colors reveal it all). Yet, the theme of balance is always present. Luke literally balances Yoda on his foot upside down while balancing rocks. C’mon. The Chosen One was to bring balance to the Force, a Force which is at once an inanimate energy field made up of microorganisms, and yet also a personal, willing thing. Is this a contradiction? No, it’s the basis of a mythos worldview.

Americans love a good cops and robbers story, the cowboys and Indians conflict. But we also seek a peaceful tranquility of balance. Star Wars gives us that battle, but simultaneously breathes of a harmonious spirituality. Destiny must be fulfilled in this universe, but personal choice never disappears. Americans, historically, believe they have a duty, a manifest destiny, in the world. But liberty is at our heart too, and we can choose to reject this perceived duty. We are drawn to Star Wars because it mimics what we feel in the first place.

Why does this matter? I believe it is very important for great societies to have a basic cultural commonality. For America, we used to be fairly homogenous in our Protestant religion. That’s not the case anymore, and even when it was, there was no unified denomination. Religion doesn’t make the cut to act as our cultural unifier, and politics certainly doesn’t either. We need a myth, one with an acceptable starting point, a Square One. Star Wars is arguably one of the few viable mythoi for America. It has such a wide array of characters and messages that different people can gain different things from it. It embraces our Western-Eastern dichotomy, and, quite frankly, it’s pure fun.

We need Star Wars not because it is a masterfully created cinematic experience, has great dialogue, or anything like that. We need it because the story, much like the Force itself, can surround and bind us together.

A version of this article first appeared at the Catholic Beer Club.

Stanton Skerjanec
Stanton Skerjanec is an economics and government instructor at Liberty Common High School, a classical liberal arts charter school in Fort Collins, CO.
