Do you trust your girlfriend/boyfriend?
Well, it's mainly decided by the inputs and outputs I have coming. I have about ten, including proximity, flex, and temperature sensors (as inputs) and the usual outputs (mainly light, noise, and power to servos). What I'm missing and want to replicate are GPS, image recognition, a compass, an accelerometer, and outputs to drive multiple wheels or rotors (I'd ideally like something that can fly).
Pet project given what I have is to build a 3D graph of the temperature in my living room and then watch it change when I turn on various devices (in particular my space heater). I'd like to get it as a video. The interesting bit isn't the graph itself but building something that can navigate in 3D, be aware of where it is, and know where it needs to be.
Probably a bit ambitious. If I end up with a home-made remote-control media server which plugs into the TV and winks LEDs at me saucily to tell me certain things, I'll be happy with that too.
--
VioletTrees
10 years ago
If it's navigating in 3D and you're working with temperature and time as well, wouldn't that need to be a five-dimensional graph? Or am I misunderstanding?
Anyway, I would actually be really interested in that graph, but I figured it wasn't the primary objective, since I doubt that would be the easiest or most precise way to do it. Now I'm trying to think of how I would do it, though, and it's an interesting problem. I know very little about robotics (more than most people, but that's really not saying much), and most of my background is in astronomy and astrophysics, so my approach would be very different. My first instinct is thermal imaging using a series of cameras in different positions, because they could take images simultaneously. But that poses its own problems (like exposure time and getting images to calculate accurate cross sections without cameras obscuring each other, although there might be an obvious solution to that second one that my brain is just refusing to produce at the moment), and I'm not thinking clearly enough at 4 AM to figure out exactly how it would work.
Actually, even once you have the data you need, I'm not really sure how to represent it in a useful way. If we're looking at a single cross section, it's fairly easy, but it seems like it would be difficult to graph in a way that makes much of anything apparent that couldn't be achieved just as easily with a table. Again, though, it's past 4 AM and I haven't slept, so I might look at this later and realise that I'm being stupid.
--
dappled
10 years ago
Ahh, I'm ahead of you here. When I first learned to program computers it was precisely to create graphs like this. The third dimension (depth) can be represented by false 3D (like a transparent wireframe grid). The fourth dimension (temperature) will be colour. Blue cold, red warm. And the fifth dimension (time) will be represented by it being a video.
Chances are you've already seen something like this. I could do it a lot easier (and better) with a thermal imaging camera, but where's the fun in that? Plus, my way is true 3D as opposed to a representation of 3D on a CCD. I'll be able to move around inside my model in a way you can't with traditional video.
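As a rough sketch, the "blue cold, red warm" colour dimension described above might be a linear blend between blue and red; the 0–32 °C range here is an assumed example, not anything stated in the thread:

```python
def temp_to_rgb(temp_c, t_min=0.0, t_max=32.0):
    """Map a temperature to an RGB triple: pure blue at t_min,
    pure red at t_max, blended linearly in between."""
    # Clamp into range, then normalise to 0..1
    f = max(0.0, min(1.0, (temp_c - t_min) / (t_max - t_min)))
    return (int(255 * f), 0, int(255 * (1 - f)))

print(temp_to_rgb(0))    # coldest: pure blue
print(temp_to_rgb(32))   # warmest: pure red
print(temp_to_rgb(16))   # midway: a purple blend
```

Each plotted point in the false-3D wireframe would then just be drawn in the colour this returns for its temperature, frame by frame, to get the time dimension as video.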
--
VioletTrees
10 years ago
Ok, I got some sleep, but I think I've got to draw some diagrams to explain what I'm talking about properly.
--
VioletTrees
10 years ago
Oh wait, I can't right now. Fuck. I'll figure something out.
I considered the colour and motion, but I can't… my brain won't make words to express the concern I have. I've made quite a few graphs like you described, but for some reason there's a problem that I'm having a really hard time expressing right now, which might be because my fatigued brain has fabricated it.
--
dappled
10 years ago
I don't think I'm going to succeed in getting the thing to fly, but I think I'll manage to get it moving around on the floor, so I might do a flat-plane colour chart with video. I'll stick it on YouTube if I ever do it.
--
VioletTrees
10 years ago
The gist of the problem is that I can see a 3D graph working for a robot that takes data over a fairly limited path, but I don't understand how you could represent the whole room without obscuring a huge amount of your data, unless you separate it into cross sections.
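The cross-section idea amounts to slicing a 3D scalar field into 2D planes, so no voxel ever sits in front of another. A minimal sketch (the grid size and made-up temperature gradient are assumptions for illustration):

```python
# A tiny 4x4x4 temperature grid, temps[z][y][x], filled with a
# fabricated gradient: warmer near the ceiling (higher z).
N = 4
temps = [[[18.0 + z for x in range(N)] for y in range(N)] for z in range(N)]

def horizontal_slice(grid, z):
    """Return the 2D cross-section at height index z."""
    return grid[z]

# Each slice can be drawn as a flat colour chart on its own,
# with nothing obscuring the data behind it.
floor_slice = horizontal_slice(temps, 0)
ceiling_slice = horizontal_slice(temps, N - 1)
print(floor_slice[0][0], ceiling_slice[0][0])
```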
--
dappled
10 years ago
I spent yesterday playing with it and sampling. The processor is a bit weedy compared to today's stuff, but I managed to get 50,000 samples per second by efficient programming. However, there's a problem. I only have 32k of memory to work with, and 1.5k is used by the program.
Even if I use a single byte for temperature, x, y, and z, that's four bytes per sample, meaning about 7750 samples max. If I want one sample set per minute for an hour, that means a little over a hundred sample points per set. The cube root of that is about five. In a (near) triangular room with opposite and adjacent sides about twenty feet each, each sample sits inside a volume of about twenty cubic feet.
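The memory arithmetic above works out roughly as follows (a sketch; 1k = 1024 bytes is assumed, which gives a ceiling of 7808 samples — the 7750 figure presumably reserves a little extra for the stack or variables):

```python
RAM = 32 * 1024            # total memory, in bytes
PROGRAM = int(1.5 * 1024)  # bytes already taken by the program
BYTES_PER_SAMPLE = 4       # one byte each for temperature, x, y, z

max_samples = (RAM - PROGRAM) // BYTES_PER_SAMPLE
per_minute = max_samples // 60    # one sample set per minute, for an hour
side = round(per_minute ** (1 / 3))  # arrange each set as a cube of points

print(max_samples, per_minute, side)
```

So each minute's snapshot is roughly a 5 x 5 x 5 grid of points, which is what makes each reading cover such a large chunk of the room.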
So, change of plan. I've got one of those white RGB LEDs which I've stuck on the breadboard, and with pulse-width modulation I've managed to get it displaying a full range of colours. Problem is, I can't hang the pulsing off an interrupt because, even though the PIO will handle it, the clock is too slow. I've got to use every cycle in an infinite loop to pulse, which means I'm also writing my own interrupt structure to sample temperature once every 29,791 cycles. When it finally works, I'll have a semi-instant colour display of local temperature which I'll move manually (i.e. by hand) and then build up my own mental image. My temperature sensor outputs 10mV per degree Celsius, so I may kick out a byte to eight normal LEDs, each bitwise ANDed with a value of 1, 2, 4, 8, 16, 32, 64, and 128 respectively. If I assume a maximum range of 32°C in my place, I can get readings down to an eighth of a degree.
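The eight-LED readout described above can be sketched like this: the sensor's 10mV-per-degree output is scaled to one byte over the assumed 0–32°C range (256 steps, so one step per eighth of a degree), and each LED simply tests one bit of that byte:

```python
MV_PER_DEG = 10   # sensor output: 10 mV per degree Celsius
RANGE_DEG = 32    # assumed full-scale range of 32 degrees

def temp_byte(millivolts):
    """Scale a sensor reading to one byte: 256 steps over 32 degrees,
    i.e. a resolution of one eighth of a degree."""
    degrees = millivolts / MV_PER_DEG
    return min(255, int(degrees * 256 / RANGE_DEG))

def led_states(value):
    """AND the byte with 1, 2, 4, ... 128: one on/off state per LED."""
    return [bool(value & (1 << bit)) for bit in range(8)]

b = temp_byte(205)        # a 205 mV reading is 20.5 degrees
print(b, led_states(b))   # the byte and its eight LED states
```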
Either way, I'm having so much fun, I'm neglecting my other toys. Even the keyboard which, itself, is amazing.