Nonlinear Neural Nets Smooth Wi-Fi Packets 204
mindless4210 writes "Smart Packets, Inc has developed the Smart WiFi Algorithm, a packet sizing technology which can predict the near future of network conditions based on the recent past. The development was originally started to enable smooth real-time data delivery for applications such as streaming video, but when tested on 802.11b networks it was shown to increase data throughput by 100%. The technology can be applied at the application level, the operating system level, or at the firmware level."
Improves network performance (Score:5, Funny)
Re:Improves network performance (Score:4, Funny)
Checks, Governing circuits, etc. (Score:5, Funny)
Re:Checks, Governing circuits, etc. (Score:3, Informative)
Re:Checks, Governing circuits, etc. (Score:2)
You're joking, right? We've barely begun to understand the human mind, and you think we can create true intelligence with a computer program? I don't deny that it will eventually happen, but it's certainly not going to be soon.
To paraphrase a smarter man than I, the AI field is still lacking its Einstein.
Re:Checks, Governing circuits, etc. (Score:2)
Re:Checks, Governing circuits, etc. (Score:4, Funny)
Re:Checks, Governing circuits, etc. (Score:5, Funny)
this hurts my geek cred (Score:5, Funny)
Yahoo Serious Festival (Score:4, Funny)
Re:this hurts my geek cred (Score:5, Informative)
Simple 'nuff, really...
Neural net - An arrangement of "dumb" processing nodes in a style mimicking that which the greybacks of AI (such as Minsky and Turing et al) once believed real biological neurons used. Basically, each node has a set of inputs and outputs. It sums all its inputs (each with a custom weight, the part of the algorithm you actually train), performs some very simple operation (such as hyperbolic tangent) called the "transfer function" on that sum, then sets all of its outputs to that value (which other neurons in turn use as their inputs).
Nonlinear - This refers to the shape of the transfer function. A linear neural net can, at best, perform linear regression. You don't need a neural net to do that well (in fact, you can do it a LOT faster with just a single matrix inversion). So calling it "nonlinear" practically counts as redundant in any modern context.
Smooth - A common signal processing task involves taking a noisy signal, and cleaning it up.
Wi-Fi - An example of a fairly noisy signal that would benefit greatly from better prediction of the signal dynamics, and from better ability to clean the signal (those actually go together, believe it or not - In order to "clean" the signal without degrading it, you need to know roughly what it "should" look like).
Packets - The unit in which "crisps" come. Without these, you can't use a Pringles can to boost the gain on your antenna to near-illegal values.
There, all make sense now?
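If it helps, the "dumb node" described above can be sketched in a few lines. The weights and inputs here are invented purely for illustration, not taken from any real network:

```python
import math

def neuron(inputs, weights, bias=0.0):
    """One 'dumb' processing node: weighted sum of inputs,
    then a nonlinear 'transfer function' (here, tanh)."""
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return math.tanh(s)  # the transfer function

# Two neurons feeding a third -- their outputs become the next layer's inputs.
hidden = [neuron([0.5, -1.0], [0.8, 0.3]),
          neuron([0.5, -1.0], [-0.4, 0.9])]
output = neuron(hidden, [1.2, -0.7])
```

Training just means nudging those weight values until the outputs look right; the node itself stays this simple.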
Re:this hurts my geek cred (Score:3, Informative)
Re:this hurts my geek cred (Score:2)
Indeed it does... You couldn't really do much with the raw received signal at a higher level than the device's firmware, and you'd probably want it even lower than that (such as in the actual analog hardware).
But that didn't fit as nicely into a pun involving Pringles.
Re:this hurts my geek cred (Score:5, Informative)
I put that in the past tense for two reasons...
First, at least the followers of Minsky have apparently deemed connectionist learning models passé. In fact, as far as I can tell, the very field of artificial intelligence has shifted away from the "intelligence" part, preferring to focus on (the far more marketable) automated problem solving and classification, rather than trying to mimic aspects of actual consciousness.
And second, neurophysiology (rather than AI research) has all but obliterated the hope that any basic variation on the standard multilayer feedforward neural net really does all that great a job of modeling the brain. It seems that real neurons do some pretty impressive processing, each having a local store, exceedingly fine-grained delay lines, self-feedback (at the signal, rather than just the obvious neurotransmitter level), and some degree of actual flow control. And that's just what we know; they may have a good many more secrets waiting for someone to notice... For example, recently, a few bright folks noticed that glia, the non-neuronal cells making up literally half of our brains, might do more than just sit there and take up space.
Please confine your answer to words of less than ten syllables
I apologize for the length of words in this domain, but I didn't make them up, we all just inherited them from people who liked Latin and nominalization waaaaaaaay too much. <G>
Re:this hurts my geek cred (Score:3, Informative)
Just one point.
'One-way' classical layered models can be said to be 'passé', but recurrent or 'looped' connectionist models are far from being understood; in fact, they are a great source of advances.
What's in a sig?
Re:Linear neural networks! (Score:2)
According to the website (Score:2, Funny)
Hahaha (Score:3, Interesting)
Re:Hahaha (Score:3, Insightful)
Re:Hahaha (Score:3, Insightful)
Damn... (Score:5, Funny)
1.Smooth
Fuck...
Re:Damn... (Score:4, Funny)
You don't know what nets are??
How do you catch stuff?
Re:Damn... (Score:2)
Simple...don't use a condom!
Re:Damn... (Score:2)
Austin: "Only Sailors use condoms, baby!"
Vanessa: "Not in the 90's, Austin!"
Austin: "Well they should the filthy beggars, they go from port to port!"
(boom-boom!)
Re:Damn... (Score:2)
Re:Damn... (Score:4, Funny)
Anyway, I'm only replying because you're being "insighful" and it made me laugh.
"Insighful" would be moderation for when a comment makes you sigh. Hmm..
Why Neural Networks? (Score:5, Insightful)
Re:Why Neural Networks? (Score:4, Insightful)
Well, it's quite obviously because a Support Vector Machine is inherently linear, and to make it nonlinear, you must insert a nonlinear kernel which you need to select by hand.
If you'd read the article, you'd see that they are using a recurrent-feedback neural network; good luck finding a recurrent-feedback nonlinear kernel for a SVM....! You can't just plug in a radial basis function and expect it to work. In this application, they are looking for fast, elastic response to rapidly changing conditions as well as a low tolerance for predictive errors--something an RFNN is ideal for, and that a SVM is absolutely terrible at.
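For the curious, the basic shape of recurrent feedback -- the hidden state fed back into the next step -- looks roughly like this. The weights are arbitrary placeholders, not anything from the article:

```python
import math

def rnn_step(x, h, w_in, w_rec, w_out):
    """One step of a minimal Elman-style recurrent predictor:
    the hidden value feeds back into itself on the next step."""
    h_new = math.tanh(w_in * x + w_rec * h)  # recurrent feedback
    y = w_out * h_new                        # predicted next sample
    return y, h_new

# Feed a short sequence through; the hidden state carries recent history,
# which is what lets the net react to "the recent past".
h = 0.0
for x in [0.1, 0.4, 0.2, 0.5]:
    y, h = rnn_step(x, h, w_in=0.9, w_rec=0.5, w_out=1.1)
```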
Re:Why Neural Networks? (Score:5, Interesting)
Well, it's quite obviously because a Support Vector Machine is inherently linear, and to make it nonlinear, you must insert a nonlinear kernel which you need to select by hand.
Not true [warf.org].
"This invention provides a selection technique, which makes use of a fast Newton method, to produce a reduced set of input features for linear SVM classifiers or a reduced set of kernel functions for non-linear SVM classifiers."
Re:Why Neural Networks? (Score:5, Interesting)
I figure the real reasons they use NNs are much simpler. First, it's really easy to implement NNs that predict numeric values instead of classes, and even more importantly, they work. Research usually involves trying everything under the sun and reporting/patenting/exploiting whatever worked best.
Re:Why Neural Networks? (Score:2)
FYI, sigmoid is not a kernel. It is not positive definite.
not really (Score:3, Interesting)
Re:not really (Score:2)
Later addition? Nonlinear kernels were already used even before the SVM was called the SVM. See here [psu.edu]. Perhaps you refer to all tutorials, which make it look like it was a later addition.
Re:Why Neural Networks? (Score:2)
Re:Why Neural Networks? (Score:2, Interesting)
I hear all too often from people in the field of machine learning who get their favourite solution (SVMs and NNs are the most common) and then they go hunting for a problem.
It might not be exactly the best technique, but if at the time it was the easiest to understand and use, and gave really good results, then the right decision was made.
Is that the difference between theory and practice right there?
Re:Why Neural Networks? (Score:2, Informative)
You might be able to build SVM implementations relatively easily on a real computer using off-the-shelf libraries etc., but I doubt many of these would run on a WiFi card.
Neural nets have also been around for quite a while, so they have gained acceptance. Although SVMs have been known to the machine learning community for quite a while now, they have only just started being
Re:Why Neural Networks? (Score:2)
Re:Why Neural Networks? (Score:2)
As for not using SVMs, I seem to recall that they are better suited for classification than for para
Re:Why Neural Networks? (Score:2)
Nature may be non-linear but more often than not linear is a damn good approximation. F=-kx anyone?
Re:Why Neural Networks? (Score:2)
science by popularity (Score:2)
I challenge you to give a mathematical justification of why you think that support vector machines would be better in this application than neural networks. While SVM papers fill
Re:What? (Score:3, Insightful)
Not gonna happen. The poster was just using random terms that have nothing to do with this article, trying to sound smart, and is probably laughing as the post gets moderated up.
Everything the poster mentioned, such as Naive Bayes [google.com] and Support Vector Machines [google.com], is used for static tasks, like classification, not for realtime feedback situations. They learn once and predict forever. They don't learn iteratively and keep changing. Follow the Google links I just gave and
Re:What? (Score:4, Interesting)
There are online methods using both the techniques you mention. The theory is usually a little more involved, so you're not likely to get a good tutorial from page 1 of google results.
Try MIT's open courseware (Machine Learning course) [mit.edu] for some better explanations of this stuff, if you can handle the maths, ughhh.
Re:What? (Score:2)
A simplification, perhaps, but he gets to the point as it matters in this application - Most of the "suggestions" as better alternatives to an FRNN do classification, not numerical output (and thus the implication, that someone just wanted to toss out cool-sounding names as a form of karma-whoring).
Sure, you could hack them up to have the classes represent the integers, but why not just start with a technique that already handles purely numeric data well? Anyone w
Some suggestions (Score:2)
Kernel machines are actually quite good at handling nonlinear regression problems.
WiFi? How about... (Score:2)
Re:WiFi? How about... (Score:2)
Skeptic (Score:5, Insightful)
just as a selling point because it sounds
like something extremely advanced and "related
to artificial intelligence".
usually the neural network is just a
very simple, possibly linear, adaptive filter
which means that really contains no more
than a few matrix multiplications
yes it has some success in approximating
things locally, but terms like "learning"
are really misused
After RTFA (the second) it actually
seems that they did try two or three
things before, but really i wouldn't
"welcome our new intelligent packet sizers overlords"
just yet.
Re:Skeptic (Score:5, Funny)
that from a mobile
phone or do
you just like to
hit enter after
every couple of
words as some
sort of nervous ti
ck?
Re:Skeptic (Score:4, Insightful)
The simplicity of the calculation does not mean it is not a learning algorithm. Real neural networks are quite simple, as each "neuron" is simply a weighted average of the inputs passed through a sigmoid or step function. However, en masse they perform better than most other algorithms at handwriting recognition. They take a training set and operate on it repeatedly, updating their parameters, until some sort of convergence is reached. Their performance on a test set is a measure of how well they have learned. This is a learning algorithm.
Even linear regression is a learning algorithm. You give it a bunch of training data as input (i.e. x,y pairs), iterate on that data until it converges, and it is then used to predict new data. There happens to be an analytic solution to the iteration, but this does not make it any less of a learning algorithm.
I think maybe your definition of "learning" is unnecessarily strict. The simplicity of the computation is not what defines this category of algorithms.
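To make the linear-regression point concrete, here's a toy iterative fit converging to the same slope the analytic least-squares formula gives. The data points are invented:

```python
# Fit y = w*x by gradient descent on invented data; the closed-form
# least-squares answer is sum(x*y) / sum(x*x).
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]  # roughly y = 2x

analytic = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

w = 0.0
for _ in range(200):  # iterate on the training data until convergence
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= 0.01 * grad  # the learning step

# w ends up within rounding error of the analytic solution
```

Same answer either way; the iterative route just makes the "learning" visible.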
Re: Skeptic (Score:5, Informative)
> usually the neural network is just a very simple, possibly linear, adaptive filter which means that really contains no more than a few matrix multiplications
No one in their right mind would use a linear ANN, since ANNs get their computational power from the nonlinearities introduced by their squashing functions. Without the nonlinearities, you'd just be doing linear algebra, e.g. multiplying vectors by matrices to get new vectors.
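A quick way to see this: stacking two linear layers collapses to a single matrix multiply, while putting tanh between them does not. The matrices below are arbitrary:

```python
import math

A = [[1.0, 2.0], [0.0, 1.0]]  # first "layer"
B = [[0.5, 0.0], [1.0, 0.5]]  # second "layer"

def matvec(m, v):
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

def matmul(m, n):
    return [[sum(m[i][k] * n[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

v = [1.0, -1.0]
# Linear-linear: identical to multiplying by the single matrix B*A.
two_layers = matvec(B, matvec(A, v))
one_matrix = matvec(matmul(B, A), v)

# With tanh squashing in between, no single matrix reproduces the result.
nonlinear = matvec(B, [math.tanh(x) for x in matvec(A, v)])
```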
As for the computational power of ANNs,
> yes it has some success in approximating things locally, but terms like "learning" are really misused
"Neural network" and "learning" are orthogonal concepts. A neural network is a model for computation, and learning is an algorithm.
In practice we almost always use learning to train neural networks, since programming them for non-trivial tasks would be far too difficult.
Prisoner's Dilemma applied to networks flows (Score:4, Interesting)
Re:Prisoner's Dilemma applied to networks flows (Score:2)
"Two suspects are arrested by the police. The police have insufficient evidence for a conviction, and having separated them, visit each of them and offer the same deal: If you confess and your accomplice remains silent, he gets the full 10-year sentence and you go free. If he confesses and you remain silent, you get the full 10-year sentence and he goes free. If you both stay silent, all we can do is give you both
Re:Prisoner's Dilemma applied to networks flows (Score:2, Informative)
Chartsengrafs (Score:5, Informative)
http://web.ics.purdue.edu/~dphillip/802.11b.gif [purdue.edu]
For a little explanation, where it says "Node 50" or "Node 100" that means that there are 50 or 100 computers on the wireless network. And the throughput numbers are for the whole network, not per host. So when 100 nodes are getting 3.5 Mbps that's
Thanks to professor Park
Why wireless only (Score:5, Insightful)
Re:Why wireless only (Score:2, Informative)
The normal internet has far fewer collisions and errors than wireless ethernet. And ethernet switches are now so cheap that it isn't worth your money to buy ethernet hubs.
And the advantage here is that it is (allegedly) a successful predictive model of whether to use big or small packets, and not r
Re:Why wireless only (Score:2, Informative)
And the advantage here is that it is (allegedly) a successful predictive model of whether to use big or small packets, and not reactive.
If you react to errors, you can resend, but you've already wasted bandwidth. If you can avoid the error in the first place, it's much better! :)
It predicts based on past performance, therefore it is reacting. The savings on switching packet size is based on resending small packets instead of resending large packets. Losing a single small packet is not nearly as bad as
Re:Why wireless only (Score:2)
Re:Why wireless only (Score:2)
Ethernet can use CSMA/CD to deal with this kind of thing and hit around 97% throughput on wires. Wifi can't use the CD part (collision detect), as transmitting and receiving at the same time requires very expensive equipment, so it can't tell if a collision has occurred while it's transmitting.
The rest of the Internet is made up of either point-to-point links which won't have collisions as there's only two stations or other wired connections that can use CSMA/CD or simi
Quick summary for those too lazy to RTFA (Score:4, Informative)
QED
EE Times Article (Score:4, Insightful)
This sounds like a great improvement to 802.11x technology...now let's open-source it so we can all benefit!
Buzzword-powered network (Score:5, Funny)
"Introducing iFluff/XP: An XML-based Object-oriented neural networking system that will synergize the modular components of your SO/HO WAN protocols, while minimizing TCO and giving five 9's reliability by branch-predicting streaming traffic through your SAN, NAS, or ASS.
iFluff/XP allows you to commoditize and monetize the super-size networkcide as rogue packets from black hats and white hats and clue bats compete for cyber-mindshare of your Red Hat hosts.
Secure your Homeland LAN and manage your digital rights with dignitude and affordability with the help of iFluff/XP's bytecode-based embedded operating system protocols interfacing through broadband Wi-Fi connectivity and virtual presence frameworks.
A user-friendly GUI is provided through an XSLT module interfacing to leading industry applications such as Mozilla,
When you're thinking of buzzword-compliant, ISO9001 conformant, remotely-managed turnkey security solutions, remember iFluff.... TO THE XXXTREME!"
Oh god, my brain hurts now.
Re:Buzzword-powered network (Score:2, Funny)
(sigh) I've had that last one, and it ain't fun. Imodium usually clears it up quick though...
HIRE THIS GUY RIGHT NOW (Score:2)
which end? (Score:2)
Re:which end? (Score:3, Interesting)
Re:which end? (Score:2)
According to the article, you can reap the benefits through a simple firmware upgrade (or even through an application). Since I don't know how the 802.11b standard works, I can't comment on whether you would need to upgrade firmware on both
Re:which end? (Score:2)
Standing back a bit, this is pretty shiny. A "simple" firmware update has the potential to double the throughput of existing equipment, so long as there's a bit of spare processing power available. I love anything that can use spare CPU time to double storage or throughput, especially in embedded systems.
Re:which end? (Score:2)
Re:which end? (Score:2)
Re:which end? (Score:2, Funny)
patents on the technology (Score:2)
Why needed? (Score:2)
LOL (Score:5, Funny)
Re:LOL (Score:2, Funny)
OT: Does anyone know if they use this stuff for TA (Score:2)
I worked in the signal processing / neural net area a while ago, and it wasn't ready for prime time, then.
Does anyone know for sure if there are commercially viable AI's making money on stock market technical analysis (TA) yet?
Looks sketchy to me (Score:2, Interesting)
I'm pretty sure that's not the case. Besides, if the technology you're pushing boils down to 'variable-sizing', seems like someone's thought of
You dont need much of a neural net anyways (Score:2)
- general slow stuff like telnet
- general high utility stuff like ftp or http file transfer
- bursted traffic like web surfing or email checking
It seems highly unlikely that you'd need a neural net to optimize the packet size for these different types.
I also don't really understand why that makes it go so much faster. I'm sure you can conduct 'corner-case' tests where it makes a difference - but on the whole i can get file transfers to run at pretty near the lines
Re:You dont need much of a neural net anyways (Score:2)
The problem to solve is that depending on the amount of traffic and noise the optimal packet length to send varies. If you send a long packet and fail you have to resend all that data, even the data which you managed to get through.
If instead you send two packets then only the packet with the error would have to be resent. However smaller packets mean larger overhead from the protocols used. This overhead to packet length ratio is optimised depending on the level of noise
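The tradeoff can be sketched numerically: with a fixed header and a per-bit error rate, expected efficiency is the payload's share of the frame times the odds the whole frame survives, and its sweet spot shrinks as noise grows. All the numbers below are invented:

```python
def efficiency(payload_bits, header_bits, bit_error_rate):
    """Expected useful-data fraction: payload's share of the frame,
    times the probability the whole frame arrives intact."""
    frame = payload_bits + header_bits
    return (payload_bits / frame) * (1.0 - bit_error_rate) ** frame

def best_payload(header_bits, bit_error_rate, sizes=range(64, 12001, 64)):
    """Pick the payload size with the highest expected efficiency."""
    return max(sizes, key=lambda L: efficiency(L, header_bits, bit_error_rate))

quiet = best_payload(header_bits=400, bit_error_rate=1e-6)
noisy = best_payload(header_bits=400, bit_error_rate=1e-4)
# The noisier the channel, the smaller the optimal packet.
```

A predictor only has to guess the near-future error rate well enough to stay near that sweet spot.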
Re:Looks sketchy to me (Score:3, Insightful)
And people have been doing this before. The EE Times article mentions that. Apparently either no one has made much progress, or no one has made much of a fuss over it before. A quick search for "variable packet length and wireless" turns up quite a lot of results though. I'm fairly confident that you can find previous research in this area if you l
Re:Im no programmer, but... (Score:4, Informative)
And next time, please RTFA, m'kay?
Re:Im no programmer, but... (Score:5, Informative)
Re:Im no programmer, but... (Score:2)
Re:Im no programmer, but... (Score:2)
Re:Im no programmer, but... (Score:2)
The application optimizes the packet size, and passes the packet to the NIC. The NIC looks at the packet and realizes it is at an optimum size, so it does nothing. The packet is then passed onto the AP, and the AP looks at the packet size, again realizing that it is already at optimum size.
Now, let's assume that the packet was at optimum size without the new software. Let's also assume that each pass through the so
Re:Im no programmer, but... (Score:5, Informative)
Now, a speed increase sounds good to me, but as most of my wifi usage is for internet use, and I don't think I've ever been on an internet connection faster than my wifi connection, I'd like to know if it helps with range, and I'm too lazy to RTFA.
Re:Im no programmer, but... (Score:5, Interesting)
Re:Im no programmer, but... (Score:2)
Re:Im no programmer, but... (Score:5, Informative)
So to say that it is all just "software" misses the fact that there is a significant difference between how these pieces of software work. It is really cool that this can be done at the application layer, because it will allow applications to be developed to take advantage of it without even changing the drivers for your wi-fi card.
Re:Im no programmer, but... (Score:2)
Resilient Overlay Networks (RON) (going on memory of exact name) can improve transfer rates and have other advantages over using just TCP/IP.
Good point, but bad example.
Re:Im no programmer, but... (Score:2)
But, what happens when product X that you totally depend on applies this at the application layer, and then product Y, which your business is "betting the farm" on, requires that it be applied at the protocol layer?
Does implementing it twice at different places in the software heap cause problems?
An example of
Re:Im no programmer, but... (Score:2)
In *theory*, packet size is determined at the protocol layer - the application layer should be unaware of what's happening at the protocol layer.
Yet, by being able to apply this at either layer could potentially break things.
For example, let's say that the optimum packet size is 410, but at the application layer it's determined to be 390.
RTFA - You'll find that the tolerance to get these kinds of gains is very slight.
Couldn'
Re:could be handy.. (Score:5, Insightful)
I'm not a network engineer, but latency is more important than bandwidth for ping times and such.
For an example, play a Half-Life game, open the console and type net_graph 3. That'll show you your fps, ping, and in/out bandwidth used.
20ms ping? (Score:2)
but as a general rule wireless always adds 20ms to your ping
Where does that 20ms number come from? My personal WLAN has about 0.9ms to the router, 1.7ms to the access point, and 2.8ms to another computer which is only connected via WLAN. So make it 2 instead of 20, and I can agree with you.
I get a 15ms ping to the router on the other side of the ADSL line I use. Now if that could be cut in half...
Re:could be handy.. (Score:5, Interesting)
In other words, bandwidth will do you zero good with a traditional tcp/ip stack if your latency is too high.
Re:could be handy.. (Score:5, Informative)
Specifically, you want to allow a lot more packets to be outstanding than a normal TCP connection will allow. This is a bad idea on a low latency connection. It has something to do with windows, and buffering. Also, if you use advanced IP tools to ensure that ACK's get sent before anything else, you'll be much happier.
This thread on the LKML seems to have useful information on it: LKML Thread [iu.edu]
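The "more outstanding packets" point is just the bandwidth-delay product: the window must cover bandwidth times round-trip time or the pipe sits idle. A back-of-envelope sketch, with invented link numbers:

```python
def window_bytes(bandwidth_bps, rtt_seconds):
    """Minimum in-flight data to keep a link busy: bandwidth * delay."""
    return bandwidth_bps * rtt_seconds / 8  # bits -> bytes

# A 5 Mbps link at 200 ms RTT needs ~125 KB in flight; the classic
# 64 KB TCP window would cap throughput at roughly half the link rate.
needed = window_bytes(5_000_000, 0.200)
capped = 64 * 1024 * 8 / 0.200  # bps achievable with a 64 KB window
```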
Kirby
Wireless net congestion... (Score:3, Insightful)
I think part of the ease of predictability may have a little to do with the kind of protocols used for collision detection/TDMA in congested 802.11 nets. If they are sufficiently simple a single node could outmaneuver the others.
Some questions...
What is the behavior of this algorithm as the number of enabled clients increases and the bandwidth demand of the clients exceeds the channel capacity? Does it degrade gracefully? Does it unfairl
Thought experiment. (Score:2)
> Now use this algorithm to predict the near future price of any given stock. I predict it won't work, and would make you poor in a hurry. Sorry folks, the future can't be predicted with 100% accuracy (or at least not consistently accurate enough to make money by day trading.)
Suppose you did have some kind of high-accuracy prediction algorithm, and everybody started using it. Then what happens?
Re:neural net (Score:2)