Is Wolfram the Smartest Programming Language In the Room? (wolfram.com) 113
theodp writes: Out of the box, does your programming language support Chemical Formulas & Chemical Reactions? Making Videos from Images & Videos? Integrals? Real Numbers? Graph Trees? Leap Seconds? Bio Sequences? Flight Data? Vector Displacement Plots? Lighting? Machine Learning? Tracking Robots? Notebooks? Creating, Deploying and Grading Quizzes? Analysis of Email Threads? Access to 2,249 User-Defined Functions? NFTs?
These are just some of the feature upgrades Stephen Wolfram touched upon as he announced the launch of Version 13 of Wolfram Language and Mathematica in a Dec. 13th blog post (for more, see What's New in Mathematica 13). Sign up for free access to Wolfram Cloud Basic here, kids! So, is Wolfram the "smartest programming language in the room"?
Betteridge's Law continues to apply (Score:5, Insightful)
The answer is "No".
Wolfram is a shameless self-promoter, which means this piece fits in quite well in the le nouveau regime here.
Re: (Score:3)
He lost me at "2,249 User-Defined Functions".
By definition: User-Defined Functions are not supplied by the language.
(or does he mean users are limited to adding 2,249 functions to it?)
Re: (Score:2)
I think it means the implementation (and there's only one, because the language is proprietary) ships with 2249 functions in its standard library.
I guess they were too lazy to find one more function to implement so it could be a nice round number.
Re: (Score:2, Interesting)
A short search returns this (may not be fully correct, but...)
Python 3 has 68 built-in functions
Perl has about 230 or so if I'm counting them right
PHP has over 700 functions
Excel has about 500 functions (this doesn't really count, does it?)
I tried searching for the number of functions in Rust, C, C++, Lisp, javascript, and several others but no clear answers. Maybe someone else can shed some light on this. How many functions do typical languages have?
But 2249 functions? That seems like crazy overkill to me.
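For what it's worth, here is a minimal Python sketch of how one might sanity-check that kind of figure, at least for Python itself. The result depends on the interpreter version and on whether you treat built-in types, exceptions, and site-added helpers as "functions", so it's a ballpark rather than an official count.

import builtins
import inspect

# Count the public callables in the builtins module, skipping exception classes.
# What counts as a "built-in function" is fuzzy (int, str, etc. are classes),
# so the total is only approximate.
names = [n for n in dir(builtins) if not n.startswith("_")]
funcs = [n for n in names
         if callable(getattr(builtins, n))
         and not (inspect.isclass(getattr(builtins, n))
                  and issubclass(getattr(builtins, n), BaseException))]
print(len(funcs))  # somewhere around 70-80 on a typical CPython 3.x

Run on a stock CPython 3.x, that lands in the same ballpark as the 68 quoted above, so the comparison with Wolfram's thousands is at least directionally right.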
Re: (Score:3)
They're quite useful if you can't add external libraries. And insufficient even so.
OTOH, due to the license I didn't look deeply into this, so perhaps you can add external libraries.
Re: (Score:2)
Modern languages are not really structured around "built-in functions".
Even C has none at all. C comes with some standard libraries like stdlib and math. And those do not provide built-in functions; they are obviously functions from a library.
Java has not a single built-in function but a gazillion methods/functions in its standard libraries, like "Math".
Stuff like that you won't find with Google, as Google is bad at mapping from "function" to "method".
You could ask how many built-in functions Lisp has, you p
Re: (Score:2)
C has zero built-in functions. You need to include a library for everything. That's not a bad thing. I believe Java is the same way.
Re: (Score:2)
Yes, but no, but yes. You don't have to explicitly import java.lang classes. So it can feel like they are built-ins. But they are indeed in a package (even if it's a built-in package), so you win.
Re: (Score:2)
Good point.
Re: (Score:2)
"The Wolfram Language has nearly 6,000 built-in functions." Find this at https://www.wolfram.com/langua... [wolfram.com].
If you think this is overkill then you have never used Mathematica which is probably true of most of the people commenting here. And you don't have any clue about how Mathematica is used or why anyone would choose to use it over Python, Perl, PHP, or Excel. Mathematica stands in a league of its own.
Re: (Score:2)
Here's one to make it a round 2250:
String FindSmartestPersonInRoom()
{
    return "Stephen Wolfram";
}
Re: (Score:2)
Off topic: "the" and "le" are the same word, so if you wanted to say "the new regime" you can drop one of them, e.g. "quite well in le nouveau regime here".
Mine does (Score:2)
Out of the box, does your programming language support Chemical Formulas & Chemical Reactions? Making Videos from Images & Videos? Integrals? Real Numbers? Graph Trees? Leap Seconds? Bio Sequences? Flight Data? Vector Displacement Plots? Lighting? Machine Learning? Tracking Robots? Notebooks? Creating, Deploying and Grading Quizzes? Analysis of Email Threads? Access to 2,249 User-Defined Functions? NFTs?
Yes, the programming languages I use support all that out of the box, apart from NFTs because I only use serious programming languages.
I think you missed the out-of-the-box part (Score:2)
Sure, these days every major language has every possible toolkit library. But making these all play nice with each other, and keeping the hacks you need to glue them together up to date as libraries deprecate things, use incompatible data representations, or require adjustments for multiprocessing or GPUs, can become a full-time effort. So out-of-the-box becomes attractive.
I've always thought of Wolfram as more of an IDE for logic and analysis than a programming language, just as, say, Igor Pro and Sigma and R and Matlab are i
Re: (Score:3)
When I was younger I avoided becoming an expert in it because I thought it would never last, and I didn't want to commit to trusting things I didn't write myself and that might become unsupported. But it turned out I was wrong, and I kinda wish I had used it more.
You would have to reimplement stuff due to changes in the language. Many of my old notebooks do not run anymore (even after "migration" by the newer versions) and I would need to change the implementation somewhat to account for language changes. Sometimes the changes are actually simplifications, things I had to do in the past (specifically load a library that no longer exists to get a function that is now standard) but even then the new form is not quite like the old one.
I use it because it is the fastest
Good points (Score:3)
I concur with what you are saying.
I am going through an exercise of getting a good QR matrix decomposition going in Java. You'd think there would be comprehensive linear algebra libraries with C and Java source code by now, but there is an element of (excuse the pun) "some assembly required."
What appears to be a starting point for a number of Java linear algebra libraries is NIST's Jama package, but this gives a different numerical output than Matlab's built-in QR decomposition. For a rectangular
Re: (Score:2)
What appears to be a starting point for a number of Java linear algebra libraries is NIST's Jama package, but this gives a different numerical output than Matlab's built-in QR decomposition. For a rectangular matrix with columns greater than or equal to the number of rows, the Jama algorithm gives a sign difference in the matrix factors from what Matlab gives.
QR factorisation isn't unique. If M = QR, then you can complicate it as M = (Q D)(D^-1 R), assuming D is invertible.
If D is diagonal, and has entries o
Re: (Score:2)
That is what I am seeing. The signs of the last column of Q are flipped as are the last row of R, an effect that cancels out when recovering matrix A from the product Q R.
That gratuitous sign flip results from applying (in this algorithm) a Householder reflection to a column that does not require any of its rows to be zeroed. I suppose one could verify that all this unneeded, degenerate reflection does is flip the signs in those places (what is it doing? reflecting across a zero-length vector?), but t
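A small NumPy sketch (NumPy here purely for convenience; it is not the Jama or Matlab routine discussed above) makes the sign ambiguity concrete: flip the sign of one column of Q and the matching row of R and you still recover A exactly, which is the D / D^-1 argument with D diagonal and entries +/-1.

import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))           # rectangular: more columns than rows

Q, R = np.linalg.qr(A)                    # Q is 3x3 orthogonal, R is 3x5 upper triangular

k = Q.shape[1] - 1                        # flip the last column of Q / last row of R
Q2 = Q.copy(); Q2[:, k] *= -1
R2 = R.copy(); R2[k, :] *= -1

print(np.allclose(Q @ R, A))              # True
print(np.allclose(Q2 @ R2, A))            # True: same A, different factors
print(np.allclose(Q2.T @ Q2, np.eye(3)))  # Q2 is still orthogonal

So a sign difference between two implementations is not a bug in either; both factorisations are valid unless you impose an extra convention (e.g. a non-negative diagonal of R).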
Re: (Score:2)
I recall when I started using Excel for analysis, my professor was quite unhappy because I did not justify or validate the result. It was just magic. And believe me, we wrote some software back then that led to spurious results. Even when we used the proper libraries, like IMSL.
Wolfram writes scientific software
Might be .. (Score:1)
.. but it has failed to change the world in terms of market dominance, where I clearly see Matlab and Python taking the big chunk.
So if you define "smartest" in the sense of whether the language is even smart enough (in marketing/design) to gain market dominance?
The answer is "No"; it's as dumb as the rest.
But nevertheless, Mathematica is a great application with a cool language for a specific domain.
Added support for NFTs (Score:3)
That is called "bloated", not "smart". (Score:4, Informative)
All this stuff belongs in libraries. Otherwise you overload the syntax and semantics of the language, and that is a very bad idea.
Bloat isn't a problem any more (Score:2, Interesting)
In the olden times of 640K computers, the size of the code mattered. These days memory and caches have grown so much that we work with data sizes and data fluxes that make the size of the code irrelevant. Unused code will be pushed to virtual memory. Sure, for lean production systems you could get by with a gigabyte less memory, but a gigabyte stick is irrelevant in cost for anything that uses a user interface. It costs less than an hour of a scientist's total salary compensation. So bloated code is not somethin
Re: (Score:2)
Read my statement again. I am not talking about memory footprint.
Re:Bloat isn't a problem any more (Score:5, Insightful)
Ah, we found the guy who develops the 10 GB to-do apps and 500 MB web pages.
Next time you're downloading that multi-gigabyte patch remember, it's this attitude you have to thank.
Re: Bloat isn't a problem any more (Score:1)
Bloat is always bad. (Score:4, Insightful)
Re: (Score:2)
Bloat means complexity, which is why we can't have secure and reliable software.
Pretty much.
Re: (Score:2)
In the olden times of 640k computers the size of the code mattered.
640K? I can remember 16K was standard and 64K was considered a luxury. You actually had to think about design and how to minimize lines of code.
Re: (Score:2)
Hell, my first computer had but a couple of hundred BYTEs of RAM and a sub-1MHz clock speed. Kid today! (sighs). :-)
Re: (Score:3)
Hell, my first computer had but a couple of hundred BYTEs of RAM and a sub-1MHz clock speed. Kid today! (sighs). :-)
Luxury! My first computer was a 3-bit, mechanical, state machine (Digi-Comp I) [wikipedia.org] and I loved it. I still have it and it works!
Re: (Score:3)
Hell, my first computer had but a couple of hundred BYTEs of RAM and a sub-1MHz clock speed. Kid today! (sighs). :-)
Luxury! My first computer was a 3-bit, mechanical, state machine (Digi-Comp I) [wikipedia.org] and I loved it. I still have it and it works!
I had one as well but my parents got rid of it at some point. Neat machine...
Re: (Score:2)
We used abacuses and were grateful for them. You guys with your fancy 'RAM'.
(And we had to walk uphill both ways to school, in the snow!)
Re: (Score:2)
Minecraft must have been painfully laggy.
Re: (Score:2)
Minecraft must have been painfully laggy.
Now that people are implementing CPUs entirely out of Minecraft blocks [youtu.be], the next step will be to compile a Minecraft executable that can run on one of those CPUs... and at that point we can close the loop and it will be Minecraft simulations all the way down.
(ObDouglasAdams: Some speculate that such a thing has already occurred, and we are living in part of it)
Re: (Score:1)
Actually you hadn't.
At that time, no one even imagined problems that needed hundreds of kilobytes, let alone megabytes or gigabytes.
The problems we solve grew with the machines we have available.
I programmed a lot on Apple ]['s etc. The main problem was never lack of memory but the odd layout of the memory, with a screen buffer for text somewhere in the middle and two buffers for graphics somewhere else. Of course, BASIC and Pascal abstracted that away.
This stupid idea of "people had to take care of memory usage" i
Re: (Score:2)
Actually, there was no limit to the size of your code in Pascal on a 640KB-memory computer. You just couldn't load it all at once into memory. You would load part of it, use it, then load another part of the code into memory, use it, etc.
In Borland Turbo Pascal, these were called memory overlays. With them, you could write programs that would have used tens of MB of memory if loaded all at once.
Re: (Score:2)
UCSD Pascal did the same.
Funnily enough, there was even a few-MB hard disk for Apple ]['s once.
No idea if the UCSD runtime system could cope with one, but I guess it would.
Floppies unfortunately topped out around 140kB.
Re: (Score:2)
Re:That is called "bloated", not "smart". (Score:5, Informative)
All this stuff belongs in libraries. Otherwise you overload the syntax and semantics of the language, and that is a very bad idea.
And all this stuff is in libraries (even if automatically loaded). The core language syntax of the Wolfram Lisp-like language has been stable for decades now. As with Python or Spark (or Java for that matter) there are libraries of functions that get invoked to perform operations. We used to have to load a lot of Mathematica libraries explicitly, but mostly (thanks to more capable hardware) we only need to do that for the more specialized or advanced operations (most ML for example).
TFA is really off-base (or at least its billing is; I didn't read it) in that Mathematica/Wolfram is a computational system (Mathematica) written in a language (Wolfram), but they are not separable in any way; they are a single product. So it can't be compared to "languages" directly, as it is apples and pears.
Re: That is called "bloated", not "smart". (Score:1)
Depends. One could argue that Matlab and its various open source clones and lookalikes, such as Octave and Scilab, are also bloated and overloaded, because you can do anything from integer arithmetic to complex-valued multidimensional numerical quadrature and symbolic manipulation with syntax that's almost, if not exactly, identical.
But then again, a lot of the history of formal mathematics has been about taking "abuse of notation" and turning it into new theorems and ways of thinking about the structure of nu
"Out of the box" is just a packaging question (Score:3)
Why focus on that? Many languages have modules for these things. How the language is packaged is a silly focus.
The "A New Kind of Science" guy? (Score:1)
Yeah, I remember the hype around that book. I'm sure we all thought humanity was on the cusp of dominating the universe.
Let's take a look around, shall we?
Define "smartest" (Score:3)
Re: (Score:2)
640K functions should be enough for anyone.
Re: (Score:2)
Re: (Score:2)
What's yer point?
I guess it's highlighting your inability to understand abstract humor.
Does my language need to? (Score:2)
My language of choice is happily optimized for the industrial control domain it works in. And I see no reason why end-user applications such as a crane designed to lift 100 tons need to deploy quizzes or analyze email chains, etc.
Re: (Score:2)
This language is happily optimized for the domain it's for :) Namely mathematics.
So you're 100% correct!
it's a client (Score:2)
It's not a programming language, it's a client for a third party math service.
Re: (Score:2)
But I too have an issue calling "Mathematica" a programming language. In my view, it is a huge math application that happens to be programmable. It is not the right tool for most programming tasks that are not mostly about scientific/math stuff. And it is way too expensive to tell people "oh w
Re: (Score:2)
Perhaps you want to check what you earn per month and what a one month license of "insert your software here" costs.
Big Languages Restrict Creative Thinking (Score:5, Interesting)
While I do appreciate and use Wolfram Alpha, I'm also aware of its limitations. For me, the greatest Alpha limitation is that I have to do things "The Alpha Way".
In Python, I'm free to bash my code together with libraries, to use them in creative ways that present some powerful advantages. I've done some horribly wonderful mashups of, say, Pandas and PyTorch with real-time signal processing and control loops.
Both Python and Alpha have "cognitive load" and "technical debt" associated with them, but it's presented very differently. In Python I have access to several linear algebra packages, and while learning each can be a pain, being able to switch between them lets me get my job done better. In Alpha, unless you dig deep into the underlying system, your math will be solved using whatever package Alpha picks for you, which may be a great choice, but it is still done behind the scenes.
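As a small illustration of that "several packages" point (purely a sketch; the packages and the toy problem are just examples), the same least-squares fit can be handed to NumPy or SciPy explicitly, rather than to whatever backend an environment picks behind the scenes:

import numpy as np
from scipy import linalg as sla

rng = np.random.default_rng(1)
A = rng.standard_normal((100, 3))
b = A @ np.array([2.0, -1.0, 0.5]) + 0.01 * rng.standard_normal(100)

x_np, *_ = np.linalg.lstsq(A, b, rcond=None)   # NumPy's least-squares solver
x_sp, *_ = sla.lstsq(A, b)                     # SciPy's LAPACK-based solver

print(x_np)  # both should come out close to [2, -1, 0.5]
print(x_sp)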
Alpha also has trouble working outside the Alpha environment, say, to construct and deploy a robotic control system. Mathematica has code export capabilities, but there are gaps moving from Alpha to Mathematica to C++, and the process is rather frustrating. Python lets me go from MicroPython all the way up to supercomputers; though not seamless, there is a delightful level of commonality.
Which, for me, means I use Alpha more like a really big and powerful calculator, where I just want a useful answer more than I care about the code used to generate it. Alpha is also a wonderful playground to use to gain mathematical and data science proficiency in an interactive and intuitive way. But that's high-level proficiency, not something you can directly ship in a product. Alpha is more amenable to basic research when you don't know what the solution path looks like. If you do have a clue, then Python (especially in Jupyter Notebooks) can get you there in multiple ways.
Re: (Score:2)
*Every* time I have tried to get Wolfram Alpha to answer or analyze in a non-trivial way, I have failed. It may be possible to figure out how to structure a query in such a way that it can actually provide the answer I'm looking for, but it hasn't happened yet, and I'm pretty good at unambiguously defining a problem in English. That's a pretty bad failure rate for a service that's supposed to be able to answer natural language queries.
I really can't figure out who Alpha is for - the Mathematica people don
Re: (Score:2)
Re: (Score:3)
I was delighted, charmed actually, by the Alpha port created for and shipped with the Raspberry Pi. It is surprisingly capable and performant. I want every middle schooler to have one of their own, as a STEM Gateway Device that would encourage them to stretch themselves through high school.
Python, too, certainly. But the whole on the RasPi is greater than the sum of its parts.
Re: (Score:1)
Just use the right tool for the right job (Score:2)
Re: religious movement (Score:2)
It has been a religious movement ever since someone wrote a payroll program in FORTRAN instead of COBOL.
Re: (Score:3)
Agreed. And I'd be far more likely to consider Alpha/Wolfram if it (like nearly all other modern data platforms except SAS and Matlab), was open source and allowed me to deploy it in the field on whatever hardware and OS I choose, with no cloud dependencies.
There are some interesting ideas in the Wolfram language, but there's a non-trivial learning curve, and at the end of the day it's still a proprietary environment. Sometimes that makes sense, like Oracle can sometimes make sense, but in general, the he
Re: (Score:2)
we will see a significant shift from Linux to one or more of the BSDs ...
Everyone hates systemd
Sounds like the CISC of languages (Score:2)
That is all.
Definitions of "Powerful" (Score:4, Interesting)
For over a decade, I was captured by FORTH, for reasons comparable to Paul Graham's article about LISP in "Hackers and Painters". Both are very simple, and low-level, but can swiftly build up large functions in amazingly terse code; and Graham identifies the frightening power of self-modifying code (also easy in FORTH) as the secret superpower.
The discussion didn't involve *any* built-in functions, just the ability to create complex logics.
But in my career, darn near nothing was actually best-solved by FORTH; I was rarely striking out to invent whole new computing concepts, handing people their own development tools (Graham's riches started with the first web application that let you design your own online "store", and 25% of the code was self-modifying). Mostly I was solving already-solved engineering problems, just with a computer, automated.
A big pile of pre-written tools, and basic programmability, was the need 100x as often - for me, and I think most programmers outside of university postgrad-level. For us, Wolfram would be better than LISP.
Alas, the competitor there is "Excel":
1) Two kinds of "basic programmability" - the functional programming that is a spreadsheet, the closest to FORTH/LISP-like programming most of us will do. Functional programming is awesome if you use it fully. And, where you just have to go procedural, there's VBA. People hate VBA, but 99% of procedural needs can be met with it.
2) Also the giant toolbox, not just of math functions: Excel spreadsheets can also embed photos and videos (and control them with VBA), can do all kinds of network communications, and can talk to commonly-used databases. (Excel is a "good enough" database client that half of the little-database needs don't really require Access.)
If you can meet most of your number-grinding needs with the software already installed - and installed by everybody in the office, to whom you can send your spreadsheet file, assured they can open it and manipulate it - it takes a pretty major lack in Excel for you to jump to a whole new proprietary environment.
All techies have some "Tim the Toolman" in us, we want the Binford 9000 Microwave that can cook a bull elephant, we want the M1 tank - just in case. But few of us actually need them.
Re:Definitions of "Powerful" (Score:4, Insightful)
Languages like FORTH and LISP are still worth learning, because they teach you a new way of thinking about programming, which you will carry over to your work in other languages.
In most cases when we choose a language, it's not because of the language, it's because of the libraries.
Re: (Score:2)
For over a decade, I was captured by FORTH, for reasons comparable to Paul Graham's article about LISP in "Hackers and Painters". Both are very simple, and low-level, but can swiftly build up large functions in amazingly terse code; and Graham identifies the frightening power of self-modifying code (also easy in FORTH) as the secret superpower.
This was always my favourite way of trolling smug lisp weenies. Whenever they brought up the self modifying code thing I'd frown deeply like I was thinking b
I know who thinks so (Score:1)
performance is everything (Score:2)
The real question is how it performs. Because coming with a bunch of things out of the box is not as important as pure raw performance. Otherwise you can just implement the same primitives in a more performant language and win in the end.
How smart is that?
There's an Emacs mode for that (Score:2)
So no need to install some nutty thing from a nutjob who thinks he's god.
You know what they say? (Score:2)
If you're the smartest in a room, you're in the wrong room.
Real Numbers (Score:2)
Re: (Score:2)
There are no programming languages that support real numbers. All actual number representations (in hardware, or written down) are in effect rational numbers that approximate real numbers. Floating point numbers have an integer mantissa of finite precision. It is physically impossible to represent a real number such as Pi or sqrt(2), because that would require an infinite number of digits.
Having said that, you can do maths with symbols that stand for real numbers, without actually representing the
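A minimal Python sketch of that distinction, using the standard fractions module and the third-party SymPy package (both just convenient examples):

import math
from fractions import Fraction
import sympy

print(math.sqrt(2) ** 2 == 2)            # False: floats are finite-precision approximations
print(Fraction(1, 3) + Fraction(1, 6))   # exactly 1/2, but still only a rational
print(sympy.sqrt(2) ** 2)                # 2: the symbol sqrt(2) is manipulated exactly
print(sympy.pi.evalf(30))                # pi to 30 digits on demand, never "all" of it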
Shhh! (Score:2)
We don't need the C++ committee to add all of those to the next standard!
Seriously, though, most of that should be supported by libraries not the language itself. Languages should be lightweight but flexible so that new libraries feel more like an extension to the language. Heavy languages tend to be problematic and murder on optimisers.
Mehh (Score:1)
Rich, lazy ... then this is the program for you (Score:2)
It's a niche language whose real difference is that it excels at symbolic math (its original purpose, if memory serves).
Everything else that it does can be done as well or better by freely available tools.
Re: Modern computers have turned into surveillance (Score:2)
Dude I don't think there is a single person in all of slashdot that has actually read even a single one of your mad memoirs.
Re: (Score:1)
His regular screeds against "big tech" (or whatever is twisting his panties) are tiresome and boring.
No one cares about his victim mentality and the terrible, terrible 'oppression' he faces trying to log into Facefuck or Pimpstagram or whatever.
In summary, he's a pathetic douche that needs to get a real hobby.
I told my son, "Go to school or you'll end up like that guy." It scared the shit out of him and now he's in college.
Re: (Score:1)
Here's a suggestion. Try speaking for yourself only. Not the rest of humanity. Or even the rest of a particular group, such as slashdot readers.
If you do that, people might have more respect for what you say.
Re: (Score:2)
If you do that, people might have more respect for what you say.
I really don't care if people respect what I say or not.
Make your own choices instead of being driven like a donkey by the opinions of others. Do you really allow others to dictate your actions based on what they think of your speech? Because that's sad.
Re: (Score:1)
I think that if you look at my post history you will find that to absolutely not be the case. I regularly voice quite unpopular opinions here on slashdot.
I also walk around unmasked, unvaccinated and fearless while surrounded by a multitude of fear-driven people who have never seen a virus and could not even begin to describe the cell-culture poisoning process by which "virologists" fool themselves and the public int
Re: (Score:2)
I also walk around unmasked, unvaccinated and fearless
Oh, so you're an anti-vax moron. No wonder you came across as ignorant.
-
while surrounded by a multitude of fear-driven people who have never seen a virus
I've seen viruses plenty of times, using SEM and TEM equipment. They exist even if you don't believe in them. That's the beauty of science: it works whether or not you're smart enough to understand it.
-
and could not even begin to describe the cell-culture poisoning process by which "virologists" fool themselves and the public into "germ theory" mass delusion.
Oh, yeah, let me guess- you "do your own research" at the Google Medical School and you majored in Bing?
Be gone, plague rat.
Re: (Score:1)
You may have seen exosomes, ie cell debris. If so, congrats. Not many people personally have.
You have never isolated what you believe to be "viruses" from a sick organism and then inoculated them into a healthy tissue (or living organism) and observed it to become ill with the same symptoms as the first.
I know you have not done it, because no one has ever done it. Do you say otherwise?
If you are a "virologist", you may have performed a cell culture, involving 6 other sources of biological material, witho
Re: (Score:2)
You have never isolated what you believe to be "viruses"
Ah, moving the goalposts, just like any Trumptard does when cornered by facts and reason. LOL
You never said anything about "isolating blah blah blah", but you just can't admit that you got shot down in flames, can you? lol
Yes, I have indeed seen viruses, using a JEOL TEM/SEM electron microscope. Go look it up, or have your mom do it for you.
-
If you believe so strongly in the reality of viral contagion, I suggest you go get that reward. Go ahead, I'll wait.
Wait all you want, plague rat. It's not my job to disprove your whackadoodle conspiracies and drain your vast reservoir of scientific ignorance.
The fact is that I've fo
Re: (Score:1)
I've never said that tiny things cannot be seen in a microscope.
What I am saying is that it is not possible to distinguish a "virus" from exosomes aka extracellular vesicles, because the latter have been misidentified as the former. They are one and the same, and are not contagious or a cause of disease.
For anyone who wishes to prove otherwise, the experiment is simple:
1. isolate what you believe to be a "virus" from fluids of a sick person.
2. inoculate said virus into healthy human tissue (with no other
Re: (Score:2)
If you're right, where's your Nobel Prize? Hmmm?
Seriously, if you're correct, your 'findings' would upend the entire field of biology, and you'd be world-famous overnight. You'd be rich beyond your wildest dreams. So...where's your Nobel Prize, buddy?
Let's face it- you're just another unvaccinated plague rat who thinks he's smarter than all the scientists in the world.
You're literally no different than the Flat Earth kooks who litter Youtube with their 'proofs' that 'the globe is a lie' blah blah blah.
In fa
Re: (Score:1)
These are not "my" findings. I'm just a messenger. But having read the works of the researchers advancing this theory (past and present), I find myself agreeing that it makes much more sense than germ theory. And I see quite clearly that germ theory was never proven or even demonstrated in any logical, reproducible fashion.
So even if advocates of the "new biology" are wrong about their theories of how disease occurs (terrain, water, energetics), that still leaves germ theory dead in the water. If you lo
Re: (Score:2)
These are not "my" findings. I'm just a messenger.
In other words, you have nothing to back this up and are relying on the unproven theories of fringe crackpots who can't withstand peer review.
Re: (Score:1)
I back this up with my own reason and logic. I have looked at germ theory and at how virologists perform their experiments and come to the conclusions they do. I find their logic and methods to lack any kind of scientific rigor, and I disagree with their conclusions. Starting with Enders' 1953 and 1957 papers that laid out the methodology used by virology ever since.
I do not care if you and every other human being on the planet agrees with the virologists. To me, their methods are flawed and do not make
Re: (Score:2)
Now you're just boring me.
Have just a shred of empathy for your fellow humans and get vaccinated.
Now you'll tell me the vaccine "doesn't work" or it's "not a vaccine" or "it's not really been tested" or you're allergic or your immune system is super-duper...blah blah BLAH.
If it turns out that you're right and all of germ theory is bullshit, I'll apologize. Really.
Somehow I doubt that'll happen.
In the meantime, I repeat: have just a shred of empathy for the people around you, and get vaccinated.
Re: (Score:1)
So you would ask me to put something I believe is actively harmful to myself directly into my bloodstream, for the supposed good of others? Even when I believe it to be harmful for others as well. And let me guess, next you would want me to inject it into others, such as children?
And if I refuse, what then? Would you advocate forced injections? I just wonder where you draw the line.
Somehow humanity and all other creatures on this planet have managed to survive for thousands (at least) of years witho
Re: (Score:2)
I wouldn't force you to do anything, but I wouldn't let you in my store or my home and I certainly wouldn't hire you.
Also, if I knew in advance that you were an anti-vaxxer I probably wouldn't buy anything from you, either. I wouldn't patronize your business.
It's the Free Market at work and your ideas aren't selling well at all. Maybe hire an image consultant.
Do you deny the vaccine works? Are you denying the numbers of per capita deaths between the vaccinated and the unvaccinated?
Re: (Score:1)
I applaud that you do not support forced vaccinations. That puts you a step above many.
I reject germ theory. It is simplistic, flawed, outdated. You can work out the ramifications of that for yourself.
You could also take this as an opportunity to learn about an alternative paradigm, but that's up to you. I've provided you with plenty of starting points.
People have been miseducated about germ theory all their lives, and feel invested in it. It challenges one's sense of identity to think it could be wro
Re: (Score:2)
If it's proven to be wrong, I'll go, "Damn, whaddya know, I was wrong" and I'll move on.
It's happened before, for example with blue LEDs. I was taught in tech school that they would never be able to make a blue LED because of (insert all sorts of reasons). They were proven to be wrong, and when they were I went, "Damn, whaddya know, I was wrong" and I moved on.
I have no problem shifting my paradigm when the circumstances warrant it, like when the evidence indicates it's time for a shift.
I don't think germ t
Re: Modern computers have turned into surveillanc (Score:2)
Re: (Score:2)
I did notice that, but that's exactly what I expect from cranks and people who claim to "know the truth" about things that we already know the truth about. They can never answer a direct question without a cavalcade of bullshit to accompany it.
Honestly, this guy is no different than the Flat Earthers, the "Moon is a hologram" kooks, or the nuts that claim the planet is secretly ruled by Reptilian overlords. (If, in fact, the Earth was ruled by Reptilians, it'd be working a hell of a lot better than it is no
Re: (Score:1)
I have. I enjoy his perspective, even if not agreeing with all of it. I think he raises some points worthy of discussion, though often off-topic.
Re: (Score:2)
Re: Modern computers have turned into surveillanc (Score:2)
But you're also not a person, so that doesn't count.
Re: (Score:1)
ok then...
Re: (Score:2)
Re: (Score:2)
Hopefully Web3 will be able to bill us automatically for whatever it thinks we want.