First, straight-up, formal economic theory is not my strong point. I took this thread as an opportunity to ask questions and try to learn more about it, so when I ask questions here, they're not meant to be leading or didactic. I hope that has been clear and will remain clear.
Trieste, thank you for your input - and congratulations on getting the competitive internship. Because this discussion involves wages, I was referring specifically to paid, waged internships. I am assuming that the internship you are involved with is not paid hourly as a wage, but is actually paid on a contract basis, regardless of the actual hours worked.
Also, I could be wrong, but based on your description, it appears that these internships are taking place within the university itself? I work at a university, and am very familiar with these types of internships that operate on a yearly contract. Again, I could be mistaken about your situation. From my experience, the majority of paid internships at private businesses go through their standard payroll, and as a result, the worker is generally assigned the lowest minimum hourly wage permitted by law. Many colleges and universities can exploit a loophole here, because they market their internships as educational and learning opportunities or work-study, whereas private businesses are under a lot more scrutiny, since work is still work. Again, I apologize if this was an incorrect assumption about your situation.
Thanks. I was very proud to land it, although the learning experience has been... *grunts* Let's just say I'm not learning as much about a working forensics lab as I had hoped, and leave it at that. I call the department where I work "Where ambition goes to die." However, that aside, the work takes place in a functioning forensic biology lab, rather than on campus. I'm a master's student, and my actual paycheck comes from the university itself. I do know that the university contracts with the lab, but to what extent I'm not sure. And I bow to your knowledge regarding how private businesses handle paid internships, as I have no experience there.
I do know that without significant support from others (i.e. my husband's pay, most of all), I would not be able to maintain a domicile without at least two other roomies - probably three would be more realistic.
Yes, that is partially what I am asserting, but it is not as simple as that. First of all, we need to realize that an externally defined 'poverty line' (defined, that is, by an agency outside the free market - aka government) is a largely arbitrary demarcation of income. President Obama's stance that one should not fall under the national poverty line while working a full-time job is one of noble intentions - and one that, from a purely philosophical perspective, many people would agree with. However, I make the case that elevating the minimum wage to accomplish this will actually be counterproductive, and result in more unemployed Americans rather than impoverished Americans. Before I explain my reasoning further, ask yourself which is the more ideal situation - to be entirely unemployed and reliant on government aid, or to make a very minimal income and use government aid sparingly?
This seems like, for lack of a better term, a false choice, especially in the current economic climate. Since we're touching on my personal situation - and I don't mind doing so, with the caveat that I acknowledge I'm not necessarily everyman - I am making a minimal income, and I am still not using government aid sparingly. In fact, I would be making more use of government aid if I qualified, because currently I'm having to pay for several things out of my student loans. My student loans are low-interest Department of Education loans, and it makes very little fiscal sense for me as an individual to keep paying for things like food out of those loans if I could qualify for, for instance, food stamps. Under the current MassHealth system (a.k.a. RomneyCare), full-time students are ineligible for state health insurance subsidies no matter how low their income is, so when my student loans come due, I will be paying interest on the health insurance costs I had to cover out of student loans.
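To put some rough numbers on why paying living costs out of loans stings, here's a back-of-the-envelope calculation. Every figure in it is hypothetical - a $2,000 insurance bill, a 5% rate, a 10-year repayment term - since I'm not about to post my actual finances:

```python
# Rough illustration of the cost of covering an expense from a loan.
# All figures are hypothetical, not my actual loan terms.

principal = 2000.00    # hypothetical insurance cost paid from the loan
annual_rate = 0.05     # hypothetical interest rate
months = 10 * 12       # hypothetical 10-year repayment term

r = annual_rate / 12
# Standard amortization formula for a fixed monthly payment.
payment = principal * r / (1 - (1 + r) ** -months)
interest = payment * months - principal

print(f"monthly payment: ${payment:.2f}")
print(f"extra paid in interest: ${interest:.2f}")
```

With those made-up numbers, that one bill ends up costing a few hundred dollars more than its sticker price - money that aid like food stamps or a subsidy would have saved outright.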
In short, the only difference between being minimally employed and being completely unemployed, for me, is not how much I make use of government assistance but which way I'm going to make use of government assistance. I still use it fairly heavily, and tbh I would use it more if I could, if it meant I could squirrel some of my wages away for the harder times that I know will come along, when I know that there will be no more aid forthcoming. I would use it for a safety net if I could manage it... and while that's smart for me as an individual, it doesn't seem very efficient as a use for government funds.
The reality is that when the minimum wage is raised, many business owners respond by cutting jobs, since there is no corresponding increase in revenue to support this sudden 'forced' increase in wages. As a result, many of the Americans currently living "below the poverty line," in President Obama's terms, will see lay-offs - they will no longer be impoverished, they will be flat-out unemployed.
The theory with which I'm familiar - and the script followed by liberal politicians in general (I identify politically as an independent, but you can't grow up in Massachusetts without picking up some serious liberalese) - is that there will be an increase in revenue, because more people will be able to afford more stuff. The logic pans out for me:
* Minimum wage increases, and employers will need to pay their employees more.
* Because the wage has just increased, employers do not yet see a bump in revenues. They lay people off to keep their cost of labor at the same percentage of revenues.
* But, since those who are still employed have more money at their disposal, they spend their money (the middle and lower classes are not necessarily well-known for their large savings, for instance).
* Revenue increases, allowing employers to rehire those laid off or take on new hires.
* New hires continue to spend more money due to higher wages, revenue continues to increase, more employees can be hired who then spend more... etc.
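As a sanity check on that chain of reasoning, I sketched it as a toy simulation. Every parameter in it - the starting wage, the labor share, the spending rate - is invented purely for illustration, so it can't prove anything about the real economy; it just shows how the loop behaves under one set of assumptions:

```python
# Toy model of the wage -> layoffs -> spending -> rehiring loop above.
# Every number here (wages, labor share, spending rate) is invented
# for illustration; nothing is calibrated to real data.

def simulate(new_wage, periods=12, employees=100, wage=7.25,
             labor_share=0.30, spend_rate=0.90):
    payroll = employees * wage
    revenue = payroll / labor_share               # revenue consistent with the labor share
    base_demand = revenue - spend_rate * payroll  # demand not coming from these workers
    wage = new_wage
    for _ in range(periods):
        # Employers trim headcount to hold labor cost at labor_share of revenue.
        employees = round(revenue * labor_share / wage)
        payroll = employees * wage
        # Workers spend most of each paycheck, feeding next period's revenue.
        revenue = base_demand + spend_rate * payroll
    return employees

print(simulate(new_wage=7.25))  # no raise: headcount holds steady
print(simulate(new_wage=9.00))  # raise: fewer jobs at the new equilibrium
```

Under these particular made-up parameters, the spending feedback nudges revenue back up a bit but doesn't come close to restoring the original headcount; tweak spend_rate or labor_share and you get different outcomes, which is sort of the whole problem with armchair-theorizing this.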
I tried looking up historical data as a quick test of this theory, and I was interested to come across this chart on Fox News. I went with Fox because they have a well-publicized, strong bias against raising the minimum wage - in fact, the article containing that chart specifically makes points against raising the minimum wage. I don't think it's a very good chart, since firstly it doesn't go back very far at all, and secondly it's comparing minimum wage to unemployment during a time period that is notorious for high unemployment that had little-to-nothing to do with minimum wage and a whole lot to do with economy-go-'splodey. What I did find interesting was the first part of that chart - from '03 to '07, minimum wage remains flat and unemployment zig-zags in a generally downward trend. The zig-zags look too small to be statistically significant, so I'm willing to call it a downward trend. Hard to call it a correlative relationship, hard to say it isn't correlative. I need more
... which is, unfortunately, hard to come by, at least as far as I can tell. My Googling and Google Scholar-ing turned up nothing that gave overall unemployment rates as compared to minimum wages over the last 50-ish years that I felt I could decipher and interpret appropriately. There was this chart that compared teenage unemployment to minimum wage - and there doesn't seem to be a strong correlation there, especially when you look at the wage raise in the late '90s that is accompanied by a continued downward trend in unemployment. However, the figure is not well-sourced, and I'm extremely hesitant to base any conclusions on it.
So, in short, I don't know. I can't support or reject my hypothesis based on what I have.
What we are seeing in this decade is rampant expansion of government welfare programs. 47 million Americans are now completely reliant on food stamps, and that number will continue to increase if the minimum wage continues to increase.
As a pure definition, the "absolute poverty line" is the "threshold below which families or individuals are considered to be lacking the resources to meet the basic needs for healthy living; having insufficient income to provide the food, shelter and clothing needed to preserve health."
It is important to realize that the 'poverty line' is a largely subjective measure, arbitrarily created by the Department of Health and Human Services. After all, the international poverty line is only $1 a day - so already we see that the poverty line is a subjective measure of lifestyle, not one set by free-market forces.
I think I would put forth the idea that a national poverty line in the US is not so much subjective as local. While the international poverty line is much lower, it functionally must be, to cover countries where the population lives in conditions that would get a US domicile condemned. I don't know that I would hold up the international poverty line as an example of free-market forces, given how much the market is interfered with by international governments, wars, etc. My definition of a free market might be different from yours - actually, it probably is, come to think of it. I would, for example, not consider something like a famine to be part of the free market, but rather an interfering agency that disrupts the market.
It's late and I've run out of replying steam but hopefully I've given some food for thought.