Wednesday, 2 April 2008
I have been trying to keep the Nokia device on all week, but it was no use. Earlier today the filesystem on the device got corrupted again, and this time the projects directory was affected! As a result I lost all of this week's data, plus some work I did over the weekend which I thought I had backed up to my desktop, but it turns out I hadn't... Really annoying. Between the resets (I wrote a script over the weekend to monitor the program and restart it if it's not running... and this was lost) and the SQLite data corruption (I think I have tracked down this bug today), I have not collected as much data as I would like...
And Django progress is going slowly too... STRESS.
Sunday, 30 March 2008
Behind Schedule, new plan
New Plan
I'm slightly behind schedule: I only started the location segmentation when it should have been completed! Now I don't really have much time for Bluetooth familiarity and presentation, and I have a new part to do too!
Here is a revised Gantt chart for the 4 major milestones I have to do:
- Django UI: A web interface which allows you to navigate the collected data; I'll be working on this for the next few weeks.
- Validation: During this time, I'll need to get some of my friends to evaluate it and adjust it according to their comments.
- Event Notification: Enter the event you want to be reminded of on the Nokia tablet or the web interface, and you will be reminded of it when the conditions match. 2 weeks for this. I'll need to take a look at the libosso library in Maemo for D-Bus notifications and figure out how to make a beep sound (see the sketch after this list).
- Visualisation: Try to make some sort of interactive, cut-down version to navigate this data.
- Finally: Documentation.
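For the notification part, my guess is that a basic notification can be raised over D-Bus without going too deep into libosso; here is a minimal sketch using dbus-python and the standard org.freedesktop.Notifications interface. Whether the tablet exposes this exact interface (rather than needing libosso/Hildon calls) is an assumption I still have to check, so treat it as a starting point only.

# Sketch only: raise a notification over D-Bus via the generic
# org.freedesktop.Notifications interface. Maemo may need libosso/Hildon
# instead; the app name and message are placeholders.
import dbus

def notify(summary, body):
    bus = dbus.SessionBus()
    proxy = bus.get_object('org.freedesktop.Notifications',
                           '/org/freedesktop/Notifications')
    iface = dbus.Interface(proxy, 'org.freedesktop.Notifications')
    # Notify(app_name, replaces_id, app_icon, summary, body,
    #        actions, hints, expire_timeout_ms)
    iface.Notify('lifelog-reminder', 0, '', summary, body, [], {}, 10000)

notify('Reminder', 'Tell Alan about the project next time you see him in DCU')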
Validation
As I already mentioned, I'll get my friends to use the Django web interface and adjust it according to their comments.
I'm also considering giving the whole device to one of my friends for a couple of days to get them to collect their own data, see how well it works for them, and incorporate any changes they suggest, as so far I have been the only person using the device.
Framework
The way the project is changing, it's becoming a framework for life log data. I'll develop two applications as examples: a life log browser and a context-aware reminder.
Other potential uses for what I have developed so far could be:
- Automatically setting a mobile phone profile to silent under certain conditions, for example when a lecturer's phone is detected or when in DCU, and changing the profile back to normal when the location is left.
- A plugin for the likes of Twitter which posts the location every time it changes... or even a blog!
- If this system is implemented for everyone working in a certain office, and all the workers' general locations are transmitted to a central server, various things can be done with it:
- Working-hours time tracking, instead of using swipe cards or manual logging.
- General availability of people in the office - showing whether someone is not in at all today, or has only gone out temporarily.
- Similar to the previous point, the PBX could be integrated with this: if the person is not available, the call could be redirected to someone else, or even to that person's mobile phone, without setting it up manually or letting it ring out.
- More to be added when I think of them.
That is all
This is most likely going to be the last post, as the blogs will not be evaluated after tomorrow. Thanks to everyone who read through my 'essay'-type posts!
Friday, 28 March 2008
Meeting
Had a very helpful meeting today with my supervisor about the project.
We decided on developing another use for what I have built already, besides just the 'lifelog' history. What I had thought about previously was adding a context-aware reminder, which beeps and reminds you when certain conditions are met, for example: next time I'm with Alan in DCU, remind me to tell him something. The notes can be triggered by any combination of a person (Bluetooth device), a location, and a date and time. Implementing this should not be too hard, but it is a lot more work on top of what I already have to do.
A basic diagram, and the things we discussed, are on the left side of the whiteboard picture:
Basically, the visualisation of the data has been scaled down; instead I'll develop a simple application to show the details by person and location, plus a simple interactive diagram which shows people and the locations they're seen in (bottom left).
The top left shows the overall picture of the new proposed architecture: a todo list is entered on the web page and/or the device, and it's synced to the device. The notification is displayed when the conditions match. I have already begun looking at how I could implement this, and I have chosen the Django framework for Python. Spent the last few hours playing around with Django and it seems fairly nice.
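To get a feel for Django I roughed out what the reminder model might look like. This is only a sketch: the model and field names (person, location_id, remind_at) are my own placeholders rather than a settled schema, and depending on the Django version the field arguments may need tweaking (older releases spell max_length as maxlength).

# Hypothetical first cut of the reminder model; names are placeholders.
from django.db import models

class Reminder(models.Model):
    text = models.CharField(max_length=200)                   # what to remind me about
    person = models.CharField(max_length=17, blank=True)      # Bluetooth address, optional
    location_id = models.IntegerField(null=True, blank=True)  # location key from the logger DB
    remind_at = models.DateTimeField(null=True, blank=True)   # optional date/time condition
    delivered = models.BooleanField(default=False)            # set once the note has fired

    def __unicode__(self):
        return self.text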
Thursday, 27 March 2008
Nokia tablet problems
The Nokia tablet has started to have fairly frequent problems with randomly resetting itself. It has been happening since I got the device, but over yesterday and today it became a lot more frequent. The reset itself is tolerable, but what causes me problems is that the filesystem gets corrupted and I have to repair it (using fsck). Even that did not fix the problem today: it would boot up fine, show the desktop environment and then reboot after a few seconds... so I had to restore the original operating system image that I have. But stupidly I forgot to back up the collected data and it was overwritten with the old data; I didn't realise until it was too late! Originally I thought it was a week's worth of data, but checking now it was only today and yesterday, so not too bad I guess, could have been worse!
Another problem I have noticed is that the logger program just dies randomly when run from the init.d script, but works fine for hours if run manually from the command line.
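Until I figure out why, one workaround would be a small watchdog that restarts the logger whenever it disappears. A minimal sketch of the idea, where the logger path and check interval are made up for illustration (and the ps flags may need adjusting for the tablet's busybox ps):

# Hypothetical watchdog: restart the logger if it is no longer running.
# The command below is a placeholder, not the real path on the tablet.
import subprocess
import time

LOGGER_CMD = ['/usr/bin/python2.5', '/home/user/logger.py']  # placeholder
CHECK_EVERY = 60  # seconds

def logger_running():
    # Look for the logger script in the process list.
    ps = subprocess.Popen(['ps', 'ax'], stdout=subprocess.PIPE)
    output = ps.communicate()[0]
    return 'logger.py' in output

while True:
    if not logger_running():
        subprocess.Popen(LOGGER_CMD)
    time.sleep(CHECK_EVERY)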
Monday, 24 March 2008
Formatted data output, getting there!
Over the last while I cleaned up and restructured some of the code, which took ages; I hit so many weird problems with PL/pgSQL, and the error reporting is not great - it has taken me upwards of an hour to spot some mistakes! So slow progress.
Now the code is pretty much ready to be used with a trigger when new data is uploaded, and I have done GPS segment to location mapping now too (if a new location is found, it is added to the location list). The way I'm doing this is by getting the average of all the coordinates within that segment which have more than 3 satellites, and then trying to match this average coordinate to a previously detected location within 200 metres. The only problem is that sometimes the average of the coordinates is not accurate enough to detect the location reliably. Will have to work on this more.
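The matching itself lives in PL/pgSQL, but in outline it boils down to something like the following Python sketch (not the actual stored procedure; the haversine formula here stands in for the earthdistance call, and 200 m is the current radius):

# Sketch of the segment-to-location matching idea (not the real PL/pgSQL).
import math

EARTH_RADIUS_M = 6371000.0

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance between two lat/lon points, in metres.
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dlat / 2) ** 2 +
         math.cos(math.radians(lat1)) * math.cos(math.radians(lat2)) *
         math.sin(dlon / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def segment_location(readings, known_locations, radius_m=200):
    # readings: (lat, lon, satellites) tuples for one segment.
    # known_locations: {location_id: (lat, lon)} of previously seen places.
    good = [(lat, lon) for lat, lon, sats in readings if sats > 3]
    if not good:
        return None
    avg_lat = sum(lat for lat, lon in good) / len(good)
    avg_lon = sum(lon for lat, lon in good) / len(good)
    for loc_id, (lat, lon) in known_locations.items():
        if haversine_m(avg_lat, avg_lon, lat, lon) <= radius_m:
            return loc_id
    return None  # caller adds a new location for this average coordinate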
Formatted output, finally!
Messed around with outputting some HTML of the analyzed data.
Formatted results are available here. (GPS coordinates are slightly changed so I don't give out where I live on the internet...) To match DCU, the average coordinate must be within 800 metres of the centre, and for home I made the radius slightly smaller. The larger radius is because frequently visited places, especially DCU, were creating lots of separate locations within them, and it also reduces some errors.
Now some of the more interesting data I have collected which is not me doing Home -> DCU -> Home:
2008-03-14
segment | Start | End | From | To | Location |
83 | 07:40:17 | 07:53:41 | 31141 | 31186 | ID-2 name Home |
84 | 07:53:52 | 08:32:41 | 31187 | 31318 | travel |
85 | 08:32:41 | 13:12:50 | 31319 | 32216 | ID-1 name DCU |
A typical day: leaving my house slightly early to make it for the 9am Real-Time Embedded lecture. The GPS must have run out of battery or the Nokia device must have reset itself afterwards; both are fairly typical.
The "From" and "To" columns are foreign database keys into the original GPS data.
2008-03-01
segment | Start | End | From | To | Location |
76 | 21:36:55 | 21:37:44 | 29789 | 29792 | travel |
77 | 21:37:57 | 21:46:18 | 29793 | 29820 | ID-2 name Home |
78 | 21:46:30 | 22:25:34 | 29821 | 29933 | travel |
79 | 22:25:47 | 23:59:55 | 29934 | 30204 | ID-16 name Merrion Square |
Turning the device on fairly late, driving into town, and parking my car near Merrion Square to go to a friend's birthday party, with the GPS device and Nokia tablet left in the car.
2008-03-02
segment | Start | End | From | To | Location |
80 | 00:01:07 | 02:20:37 | 30207 | 30612 | ID-16 name Merrion Square |
81 | 02:20:49 | 03:31:04 | 30613 | 30826 | travel |
82 | 03:31:16 | 05:00:24 | 30827 | 31138 | ID-2 name Home |
The new day: there till around 2:20, and then leaving for home... the GPS ran until its battery gave out at 5am. It did not detect me giving a lift to friends beside DCU, where I stopped for less than 5 minutes, nor did it detect me filling up with petrol. But these types of segments are very short and not really considered significant.
2008-02-11
segment | Start | End | From | To | Location |
62 | 09:10:05 | 09:11:13 | 25709 | 25712 | ID-2 name Home |
63 | 09:12:01 | 09:39:52 | 25713 | 25800 | travel |
64 | 09:40:15 | 12:33:13 | 25801 | 26325 | ID-1 name DCU |
65 | 12:33:46 | 12:43:02 | 26326 | 26354 | travel |
66 | 12:43:25 | 13:10:47 | 26355 | 26442 | ID-13 name Clontarf Dart Station |
67 | 13:12:22 | 13:23:50 | 26443 | 26470 | travel |
68 | 13:24:02 | 21:05:07 | 26471 | 27825 | ID-18 name unknown |
This is slightly more interesting as there is more activity in here.
Leaving home, parking my car in the estate beside DCU, driving to Clontarf Dart station, leaving the GPS in my car, going for an orthodontist appointment and driving back to DCU. The last area is a bug in the data normalisation which I have to take a look at.
2008-02-08
segment | Start | End | From | To | Location |
58 | 16:52:52 | 20:58:09 | 24442 | 25153 | ID-1 name DCU |
59 | 20:58:22 | 21:10:29 | 25154 | 25199 | travel |
60 | 21:10:42 | 23:12:10 | 25200 | 25579 | ID-12 name Friends party |
61 | 23:12:32 | 23:49:57 | 25580 | 25706 | travel |
Interesting too: turning on the device in DCU, a small travel period up close to Griffith Avenue, dropping in to a friend's party for 2 hours and then driving back home, except that during the travelling the GPS device ran out of battery - it lasts around 7 hours!
So I'm fairly happy with the output, apart from the way I'm matching the average GPS coordinate to an existing location... I'll have to find a better way of doing this if I have time... and I need to hunt down the bug from 2008-02-11.
Data collection problems
As previously mentioned, not getting time to charge the Bluetooth GPS device is a problem and only around 7 hours of data is collected when I don't get a chance to charge it.
I lost a few days of data because, at times when the GPS has not acquired a signal after running out of battery, the time/date it reports is wrong... I had to delete around 2000 GPS coordinates because of this.
And I was mainly at home from mid-December to February. I didn't realise how little data I collected for February... Now I'm going to try to collect data every day for the next few weeks.
What's next
Fix the bug, mentioned a few months back, where I can't transfer the Bluetooth logs to the Postgres database, and start to think about how I'm going to let the user access this data nicely under Processing. And clean up a lot of things! So this is basically nearly the end of the 2nd stage of the project.
I'm having a meeting with my supervisor on Friday, so a few new ideas might come in :). I'll try to post up the algorithm I'm using and a new revised Gantt chart.
Thursday, 13 March 2008
Eureka: Location/Travel Detection works!
Visualizing Data Book Arrived
Had a scan through it; lots of example code for doing things, so that's good :).
Last two weeks
The last two weeks have been on the crazy side, especially last week. I was working on how to classify location and travel from the raw GPS data, which ended up following this pattern:
- Change some code/constants that determine the thresholds for when a location is changing vs. travelling.
- Run the algorithm.
- Look at the output and the actual database to see what was not detected, what's correct, etc. Repeat.
Ended up making progress this week, after meeting with Daragh from the CDVP and Aiden. They suggested a windowing approach for normalising the data I had (one of my main problems was how to ignore a 5 minute stop in traffic while travelling, rather than counting it as a location). More on this later.
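The gist of the windowing idea, as I understand it: instead of classifying each reading on its own, average the speed over a sliding window of surrounding readings, so a short stop at lights does not break a travel segment. A Python sketch with made-up thresholds (the real thing is PL/pgSQL, and the numbers will need tuning):

# Sketch of windowed stop/travel classification (illustrative thresholds).
def classify(readings, window=10, travel_speed=2.0):
    # readings: list of (timestamp, speed_m_per_s) tuples, time ordered.
    # Returns one 'travel' / 'stationary' label per reading, using the
    # average speed over the surrounding window to smooth out short stops
    # (traffic lights) and GPS jitter.
    labels = []
    for i in range(len(readings)):
        lo = max(0, i - window // 2)
        hi = min(len(readings), i + window // 2 + 1)
        speeds = [speed for _, speed in readings[lo:hi]]
        avg = sum(speeds) / len(speeds)
        labels.append('travel' if avg >= travel_speed else 'stationary')
    return labels

Runs of readings with the same label then become the location/travel segments.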
It works!
The table shows the output for a certain day and the detected location/travel boundaries. The numbers are keys into the raw GPS data for where each segment starts and ends:
Running the algorithm (a couple of hundred lines of PL/pgSQL):
- Starting off at my house, and leaving it at key id 6751. Looking at where this coordinate is, it's around 50-100 metres from my house, less than a minute's drive.
- Taking a point close to the end of the travel, 6828, which is "53.47265, -6.20259".
- 6091 is me in DCU's car park 2: 53.38712, -6.26106.
- Checking a few coordinates back, 6897 is 53.38604, -6.25649. I'm driving into DCU, so where it detected the end of the travel is accurate to within a couple of points!
- 7707, 53.38575, -6.25719, is me walking about DCU. Same location.
- The rest of the time within that range, I'm around the DCU area.
- The few points beside the end of the transition, I'm around the DCU car park. It's not till 8506 that I'm leaving DCU properly, but the fast speed meant the change was detected fine.
- Now the moment of truth: will it detect me going to the Porter House North properly? The travel stage is detected as 6755 - 6901, which I have shown is accurate for when I left DCU.
- Showing the GPS data for me leaving DCU and driving to the Porter House North:
- 8518, 53.36735, -6.27054, I have still not arrived.
- 8519, 53.36735, -6.27055, still not there.
- 8520, 53.36547, -6.27167, very close, but still on the road; this is where the boundary got detected.
- 8524, arrived and parked my car. The rest of the coordinates jump around a few metres to each side due to the GPS not being accurate. The detected end of the travel was within 4 coordinates, out by 90 seconds! I'm happy with that if it works this well the rest of the time.
- Leaving the place at 8868, again within a few coordinates of when I'm actually beginning to travel properly.
The project blog deadline is the 24th of March, and I'm away all of next week, so I'm not sure I'll have time to update on this more. As I previously mentioned, checking how accurate it is is very time consuming (even more so documenting it on the blog - this post took me well over an hour of constant work!). Still loads of work to be done and time is running out, especially with all the assignments that we're getting!
Monday, 3 March 2008
More on GPS extraction
Spent all of today messing around with a way to get rid of the errors in the GPS recordings when indoors. This led me to reading up on some neural nets, but I dropped that idea as it would be a lot nicer if the functionality which extracts locations lived inside the database... it would take care of deciding when to run over that data (or I could just schedule it once a day if it goes that way).
Was messing around with PL/Python, which allows me to write Python code as stored procedures inside Postgres... but it soon got messy (trying to access all the result columns and naming them), so I checked out PL/pgSQL, which is basically Oracle's PL/SQL for Postgres. But I'm not really familiar with it, so I kept making stupid mistakes... you can't name/alias results from a table as 'a' (it works fine in a plain SQL statement, but not in PL/pgSQL) - that one took me probably nearly an hour to chase down.
Didn't make much progress on this today or yesterday... I have another full day planned for trying to figure this out on Wednesday.
It turns out that the Visualizing Data book order was cancelled because they could not fulfil it at the current time... so I ordered it from Amazon.
RSSI
Spent some of yesterday looking at RSSI for Bluetooth devices, but I didn't feel too great and got distracted too easily.
What I managed to get done is to display values using the example in PyBluez, but I'm not too sure it works correctly:
- The values were high (around -30) for all the devices, no matter what the distance.
- I can't fully understand the code; it seems to just listen for the RSSI event somewhere inside the BlueZ internals and format it somehow with the ba2str function... my guess is that it means Bluetooth address to string.
- I looked at the code of the previously mentioned BlueProximity, and it has a comment about pretty much exactly the same function as in the PyBluez example, saying that it doesn't work... The way BlueProximity gets the RSSI value is by parsing the output of the 'hcitool' command (a sketch of this is after the list). I'm not sure how much slower it would be, or how much more power it would use, if I implemented it by polling that every minute or less.
There are also problems with the device names themselves:
- The name reported can be out of date if it has been in use for a long time.
- Sometimes the name reported is the name of a different device. I see this fairly often with my Bluetooth GPS receiver... other devices showing up with its name.
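For reference, here is a rough sketch of the hcitool polling approach mentioned above. It is untested and makes a couple of assumptions: that a connection to the device is already open (as far as I know 'hcitool rssi' only reports a value for connected devices) and that the output looks like "RSSI return value: -5".

# Hypothetical RSSI polling by parsing 'hcitool rssi <addr>' output,
# the same approach BlueProximity uses. The address below is a placeholder.
import re
import subprocess
import time

def read_rssi(addr):
    proc = subprocess.Popen(['hcitool', 'rssi', addr],
                            stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    out = proc.communicate()[0]
    match = re.search(r'RSSI return value:\s*(-?\d+)', out)
    return int(match.group(1)) if match else None

while True:
    print read_rssi('00:11:22:33:44:55')  # placeholder address
    time.sleep(60)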
Saturday, 1 March 2008
Meeting, Large errors in GPS readings
Meeting with Dr. Hyowon Lee 21st Feb
Had a meeting with Hyowon about a few ideas on how to implement the UI. A few good ideas; I need to think about them further. I borrowed the Envisioning Information book and skimmed it.
I also ordered Visualizing Data the same day, but it has not arrived yet... Hopefully Book Depository don't lose the book as they did with the previous book I ordered from them.
The filesystem on the Nokia device got corrupted (presumably when it ran out of battery) and it would not boot properly. Luckily, this was on the MMC card, so I could still boot the OS from the onboard flash and run fsck (file system check), which showed up some corrupted/lost files; but luckily everything so far seems to work fine.
GPS Extraction
Didn't get much work done over the week and was away in Co. Clare over the weekend, so I spent all of today trying to figure out how to approach location extraction from the GPS logs.
After dusting off my SQL/PL/SQL skills, I figured out how to calculate the distance between 2 points, taking the Earth's surface into account, using the PostgreSQL earthdistance module. It took me ages to figure out why the results showed a large error compared to the one calculated by the GPS Visualizer calculator. I'm not too sure what I did in the end, but changing the latitude/longitude column type from 'REAL' to 'DOUBLE PRECISION' seemed to fix it.
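For reference, the query I'm building on boils down to something like the sketch below (run here from Python with psycopg2 just to show the shape of it; the gps_log table and column names are placeholders for my actual schema, and it assumes the cube and earthdistance modules are installed in the database):

# Sketch: distance in metres between consecutive GPS readings using
# earthdistance. Table and column names are placeholders.
import psycopg2

conn = psycopg2.connect("dbname=lifelog")  # placeholder connection string
cur = conn.cursor()
cur.execute("""
    SELECT a.id, b.id,
           earth_distance(ll_to_earth(a.latitude, a.longitude),
                          ll_to_earth(b.latitude, b.longitude)) AS metres,
           extract(epoch FROM (b.recorded_at - a.recorded_at)) AS seconds
    FROM gps_log a
    JOIN gps_log b ON b.id = a.id + 1
    ORDER BY a.id
""")
for id_a, id_b, metres, seconds in cur.fetchall():
    rate = metres / seconds if seconds else 0  # rate of change, m/s
    print id_a, id_b, metres, seconds, rate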
With this, and more PL/pgSQL hacking, I came up with something which shows me the distance between 2 consecutive points and the rate of change with respect to time:
This shows my trip from the M1 motorway to the Bank of Ireland, where I needed to wait a while for the bank to open at 10am.
The fields are:
- Latitude, Longitude of 1st coordinate
- Latitude, Longitude of the following coordinate
- Average number of satellites detected between the 2 readings
- Average speed between the 2 readings
- Date and time of the 1st reading
- Distance between the 2 points over the Earth's surface, in metres
- Time in seconds between when the 2 readings were taken
- Rate of change, in metres per second
Now for the large errors; following one morning's readings:
- I seem to have parked my car in one of the estates close by to DCU.
- Then walked to the middle of DCU.
- Within a few seconds, I seem to end up 14km away, somewhere on the southside!
- Within 3 minutes, someone teleports me onto the M50 near Whitechurch.
- Within 10 minutes, I end up somewhere in Kildare.
- Then somewhere beside the M50 near Finglas (the last coordinate on the screenshot).
- After this, somewhere in the Irish Sea.
- Then in DCU sometime before 11.
I hope that most of the time the GPS just gives the last known coordinate - i.e. the one from before the building was entered... that is what I understood from my previous testing.
I have to come up with a way to filter these erroneous coordinates out! And also try to see how I can use the rate of change to detect entering/leaving a location.
Will update further.
Update 1st March: Forgot to mention that the way I'll schedule the location extraction is by having a special table which records what data got transferred from the Nokia tablet each time a transfer happens, and that table will have a trigger which executes a Python function to do all the work. Then this data will be exposed with some sort of SOAP/XML interface that the UI will be able to query and present.
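The trigger side could look roughly like the sketch below. It is a sketch only: the upload_log table, its id column and the extract_locations() function are placeholders for whatever the schema ends up being, and it assumes the PL/Python language is installed in the database.

# Sketch: run the extraction whenever a row is added to the (hypothetical)
# upload_log table. extract_locations() stands in for the real routine.
import psycopg2

DDL = """
CREATE OR REPLACE FUNCTION process_upload() RETURNS trigger AS $$
    # TD["new"] holds the inserted row; run the extraction for that upload.
    plan = plpy.prepare("SELECT extract_locations($1)", ["integer"])
    plpy.execute(plan, [TD["new"]["id"]])
    return None
$$ LANGUAGE plpythonu;

CREATE TRIGGER upload_log_extract
    AFTER INSERT ON upload_log
    FOR EACH ROW EXECUTE PROCEDURE process_upload();
"""

conn = psycopg2.connect("dbname=lifelog")  # placeholder connection string
conn.cursor().execute(DDL)
conn.commit()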