Parallel: Teaching contemporary American literature to undergrads in 2019 was utterly bizarre because they hadn’t lived through 9/11. So much stuff went over their heads. There’s just a disconnect you’re always going to have because of lived experience and cultural changes. It’s your job, especially in a philosophy course, to orient them to differing schools of thought and go “oh, I didn’t think about it that way.” And correct them on Nietzsche, because they’re always fucking wrong about Nietzsche.
I've been a college and university prof for the past 6 years. I'm in my early 40s, and I just don't understand most of the people in their 20s. I get that we grew up in really different times, but I wouldn't have thought there would be such a big clash between them and me. I teach about sound and music, and I simply cannot catch the interest of most of them, no matter what I try. To the point where I'm not sure I want to keep doing this. Maybe I'm already too old school for them, but I wonder who will want to teach anymore....
I think this is less time-specific, and more just people not being terribly interested in learning.
For example, a professor who specialized in virology was explaining everything about how pathogens spill over between species, using a 2010s ebola outbreak as an example. I was on the edge of my seat the entire time because it was as fascinating as a true horror movie, and yet other students were totally zoned out on Facebook a few rows ahead of me, even while the professor was talking about organs dissolving and the fecal-oral (and other fluids) transmission route of ebola. Not exactly a dry subject, lol.
Rinse and repeat for courses on macro/micro economics, mirror neurons, psychology classes on kink, even coding classes.
Either I'm fascinated by stuff most people find boring, or a lot of people just hate learning. I'm thinking it's the latter, since this stuff encompassed a wide range of really interesting subjects from profs who were really excited about what they taught.
I miss them a lot, I used to corner various profs and TAs and ask them questions about time fluctuations around black holes, rare succulent growing tips in the plant growth center, and biotechnology. It was fun having access to such vibrant people :)
That is the same sentiment my music teacher had 15 years ago, and the same sentiment his music teacher had before that. I don’t think it illustrates the times so much as that teaching is a tough and thankless job and most people aren’t interested in learning.
I could get that at the grade school level, but at the university/college level those students are choosing the music classes. To be that disengaged for a course you picked is a bit different than a student who is forced to take a course.
That being said, if the course is a requirement that does change things a bit.
Yeah, I'm not sure I agree with this. I've always said to myself that I didn't want to fall into this old-versus-young rhetoric, but I think the situation is different. The world and technologies are changing faster than our ability to integrate them. The world in which my father lived wasn't that different from his father's, and neither was mine. But young people, born into the digital age, have been the guinea pigs of social media and the GAFAM ecosystem, which seems to have radically altered their ability to concentrate (even watching a short film is a challenge), as well as their interest in learning. They see school, even higher education, as a constraint rather than an opportunity. I have the impression that they don't see the point of learning when a Google search or AI answers everything, and that retaining things feels useless to them. That's my 2 cents...
I’ll chime in and say that math teachers have said similar things about calculators/graphing calculators for 25+ years. This is most definitely you getting “old”. It’s okay—it happens to all of us.
As far as attention span, that has been an equally common refrain—going back to people complaining that radio had reduced kids’ attention spans.
Interesting points. I don't think calculators are equivalent to having the sum of humanity's knowledge, AI, and infinite content in your pocket, though. There's a limit to how much fun you can have with a calculator.... The same goes for attention in class. Not so long ago, if the class bored you, you had to wait it out while scribbling in a notepad. Now you can doomscroll anywhere, anytime. These kids have been test subjects for iPads, YouTube content, and smartphones. I don't blame them; I blame the capitalism that made them addicted to social media, and their parents who didn't protect them.
I also want to add that I have some great students, invested in their studies and super bright. It's just that a majority of them now seem to be incapable of focusing on anything for more than a few minutes.
I teach chemistry at a college, and I don't think it's any different than the past; it's just more obvious. When I was in middle school, I would tune out all the time, but I didn't have a smartphone, so I brought shitty fantasy novels to read under the desk. In high school, I would tune out all the time, but I didn't have a smartphone, so I would just leave or draw band logos. In undergrad, I would tune out all the time, but I didn't have a smartphone, so I doodled or wrote song lyrics in the margins of my notebook. Even in grad school, I would frequently just straight-up dissociate my way through lectures when I ran out of attention span (so every 5 minutes or so).
There's tons of pedagogy and andragogy research showing that humans in general only focus for 10-15 minutes at a time (and it's even shorter for teens and males in their early 20s), and that's remarkably consistent across generations. I don't think people actually have shorter attention spans; they just have an easy way to mindlessly fill that void that is harder to come back from without an interruption. Frankly, my students from Gen X all the way to Gen Alpha do pretty well at paying attention, but even my best students still zone out every few minutes, and that's fine. It's just human nature and the limitations of the way our brains are structured.
Pretty much. I think a lot of the anger over phones is that they make it really obvious when someone doesn't care what you're saying. You're right that you used to be able to look out into the classroom and not really tell who was focused and who was zoned out.
As someone who is young but old enough to remember when boredom was a thing, let me tell you: boredom sucked. There wasn't really anything to it worth keeping. Yeah, sometimes I go for a walk and have a think, but that's intentional. Being bored when you're stuck in line or something is just painful and has no redeeming qualities.
100%. The only redeeming quality of boredom is that it encourages you to go out and gain other interests and skills in the absence of other entertainment, but that's more in the "I'm done with my homework and have nothing to do for the next 2 hours until dinner" sense. And even before smartphones, TV, booze, and weed easily filled that niche if you weren't careful.
I'm pretty sure it's always been the case that most students didn't care, because they're forced to be there. I don't even remember being awake for the majority of precalc because first period is just too early in the day.
I'm not sure that tech is really changing all that fast. In the 1990s a good desktop computer had 40 MB of HDD space and 2 MB of RAM. In the 2000s the hard drives were already 1000x as big, and people had hundreds of MB of RAM. That's a massive amount of change in just a decade. In the early 1990s nobody had heard of the Internet. By the 2000s it was everywhere.
Sure, these days a low-end phone has much higher specs than that. But, has the phone-using experience really changed much in the last decade? Even the last 2? Specs have gotten better, but it hasn't really opened up new ways of using the device. Yes, in some ways things are still moving quickly, but it's always been like that. Some things change rapidly, other things slow down.
I agree that people's ability to concentrate has been affected. The fact that "attention" has been turned into a kind of currency means that people seem to have lost the ability to focus on one thing for an extended period. That's something that's unique to the last 1-2 decades. But, I don't think people's interest in learning has changed. It's just that the traditional way of learning in a classroom is much harder if your attention span is shot. It was never easy, most classes were always boring, but people could get through it because they were still able to focus for extended periods.
School was also always a constraint for most people. People who could go to school for the love of learning rather than as a means to an end were always a lucky minority. If you were really lucky you got a teacher / prof / teaching assistant who could make things interesting. But, in most cases they droned through the required material and you tried to absorb it.
I agree that now that searching the Internet is easier, certain methods of learning / teaching are outdated and haven't been adapted yet. Memorizing facts was always stupid, but at least when it took a while to look it up in a paper encyclopedia you could just vaguely see the value. But, these days it's so obviously absurd -- yet that's still what a lot of teachers focus on. It's not to blame the teachers though. They often don't have the freedom to change the way they teach, especially today now that there are so many standardized tests. But, memorizing facts about history, for example, is just ridiculous in a world where looking up those facts even with a vague search like "french guy who tried to attack moscow" will take you right to Napoleon.
Some of the most useful classes I ever had were the ones that taught me to analyze and understand information. For example, a philosophy class on analyzing arguments and identifying logical fallacies has been incredibly useful, and only more so in an age of misinformation and disinformation. Then there were engineering courses that taught how to estimate, and science courses that taught significant figures and error analysis, which are extremely important when you have calculators and programs that can spit out an answer to dozens of decimal places even though the values you supply are approximate. These sorts of things are incredibly useful in a world where a magic machine can spit out an answer and you need to think about whether that answer is reasonable or not.
Looking at music, there's so much that I've learned outside of school that I never learned in school. I stopped taking music classes at the end of high school, and wasn't all that interested in music for a while. But, since then I've become more interested. And, there's so much that's not easy to learn just using the Internet. Like, trying to understand the circle of fifths, or the various musical modes, or how to spot certain pop/rock songs as using various 8 or 12 bar blues patterns. I'm lucky because I have a friend who has a PhD in musicology who is willing to chat with me about things I find interesting and want to know more about.
Anyhow, my main point is that I don't think kids today are really any different from any other kids throughout history, with two main exceptions: their attention span and the immediacy of information on the Internet. Concentrating in school has always been extremely hard, but at least when I was young I hadn't been trained from age 3 to doomscroll. That means that staying focused through a 1-hour class, which was a chore for me, is a near impossibility for a kid weaned on a smartphone or tablet. As for memorizing: even when I was young, memorizing facts seemed like a waste of time. These days it's clearly ridiculous, yet the approach to education hasn't fully adapted. Really, kids in elementary school should be learning how to fact-check, how to cross-verify, how to identify misinformation, etc. But even if teachers know that, they're boxed in.
Best of luck to you though, it's good that at least you want to jam information into some brains.
For me personally, life stress and exhaustion are bigger focus inhibitors. I agree that school is largely obsolete and I don't really blame kids for checking out
This is basically how teaching secular ethics always is, though. Doesn't seem special about 2025. People will always be overconfident in their beliefs, but it's not necessarily a coincidence or even hypocrisy that they can hold both views at the same time.
You can believe that morality is a social construct while simultaneously advocating for society to construct better morals. Morality can be relative and opposing views on morality can still be perceived as monstrous relative to the audience's morality.
Can both points not be true? There will be local morals and social morals that differ from place to place with overarching morals that tend to be everywhere.
Not all morals or beliefs have to be unshakable or viewed as morally reprehensible for disagreement.
Unless they mean all their ethics are held that way in which case that's just the whole asshole in a different deck chair joke.
I'm sure both are true for some people, but I think the irony he's pointing out is that this belief system recognizes that every individual/culture has different morals, while simultaneously treating individual/cultural differences as reprehensible.
Sounds like someone who was raised in an echo chamber. They recognize other chambers exist, but hate that they do. We're back to tribalism.
Or someone with strong morals? I think LGBT people deserve to live. I understand that other people do not based on their own moral arguments. I would not want to associate with them. I don't live in an echo chamber. I recognize and interact with people with different beliefs (even on LGBT issues), but there are certain moral beliefs that make me not desire to interact with people. Is that tribalism or my morality? If I don't wanna hang out with nazis, I guess that's tribalism and the outgroup is nazis? Should I stop living in an echo chamber and hang out with more nazis?
The concept of an echo chamber, when used in this casual way, is so reductive. "People hang out with others who share their beliefs and consume media that aligns with them." That's not inherently a bad thing. It becomes bad when they are unable to recognize that other beliefs exist and unable to accept at least some of them as valid alternative perspectives.
The context is important - “morals” covers both “I think drinking is/isn’t an inherently morally irresponsible activity” and “I want to gas minorities”, and one of those has slightly higher stakes. You can understand that the latter often happens because small-town America might not have ever met minority groups, or somehow figures the small immigrant community with delicious food is “one of the few good ones” - that doesn’t make their “morals” any less reprehensible.
If you agree that morals are relative and culturally constructed, then you shouldn't reject differences in morals of others as immoral.
That's basically just taking a position where you want to be able to change your mind on what's "moral", and expect everyone else to follow your opinion on it.
I don't think acknowledging that morals are relative to the culture they exist within exempts us from decrees of immorality. Relative to their culture, it is immoral. Should they speak from the point of view of a culture they don't understand? I personally think it's a sliding scale: to the extent something harms other people, it needs to be judged more objectively, and where it doesn't harm, it's fine being a difference in opinion. The only downside is that sometimes you don't know enough about a topic to know there are victims, and so your thoughts about its morality can change very quickly. Perspective is important and should always be maximized to avoid this problem.
The misunderstanding I see here is in the definition of “subjective”.
Subjective is often used interchangeably with opinion. And people can certainly have different opinions.
But the subjective that is meant is that morals don’t exist without a subject, aka a mind to comprehend them.
A rock exists whether or not a mind perceives the rock. The rock is objective. It is a physical object.
The idea that it is wrong to harm someone for being different is subjective. It is an idea. A thought. The thought does not exist without a mind.
So yes. Morals are all subjective. Morals do not exist in the physical world. Morals are not objects, they do not objectively exist. They exist within a subject. Morals subjectively exist.
That does not mean that any set of morals is okay because it’s just an opinion, bro. Because it’s not just an opinion. Those subjective values affect objective reality.
I think this is a bit too simple. Suppose I say that moral badness, the property, is any action that causes people pain, in the same way the property of redness is the quality of surfaces that makes people experience the sensation of redness. If this were the case, morality (or at least moral badness) would absolutely not be a subjective property.
Whether morality is objective or subjective depends on what you think morality is about. If it's about things that would exist even if we didn't judge them to be the way they are, it's objective. If it's about things that wouldn't exist unless we judge them to be the way they are, it's subjective.
Nobody used the word subjective. What are you on about?
So you legitimately don't recognize the screenshot as being fundamentally based around the issues of subjectivity and objectivity?
I mean.. what are you on about?
Probably in relation to the use of 'relative', I guess a synonym for subjective?
(Edit) I thought it was an interesting comment btw
I don't know, I might intellectually understand that morals are relative to a culture, and that even our concept of universal human rights is a heritage of our colonial past, on some level trying to push our own values as the only morality that can exist. On a gut level, though, I am entirely unable to consider that LGBT rights, gender equality, or non-discrimination aren't inherently moral.
I don't think holding these two beliefs is weird; it's a natural contradiction worth debating, and that's what I would expect from an ethics teacher.
That's because there are 2 general schools of thought in ethics - relativism and absolutism. Relativism (the idea that morality is intrinsic to the person's experience and understanding) is the one that seems to be the most talked about in general society. I believe in absolutism, the idea that there is a set of guidelines for moral behavior regardless of your experiences or past.
Your example (more formally known as the paradox of tolerance) is what convinces me that absolutism is the better school of thought