The Google Summer of Code for 2011 is now over for me. The final state of the UVC driver project, while very far from perfect, is at least at a point where incremental improvements can be made. Literally the day (maybe two days, depending on which timezone you’re in) before the “firm pencils down” date, I finally managed to get data all the way from the camera to the screen. The decoding of that information is totally wrong at this point, but coloured pixels show up on the screen and appear to react when things move in front of the camera. Success?!
Hello all,
GSoC 2011 is over, and my work on SDL 1.3 for Haiku is over too, for now.
I intend to continue working on the project, although I probably won’t start again for a while, as the recent errors have been frustrating and I need to relax a bit.
An almost-current version is available at https://bitbucket.org/antifinidictor/haiku-sdl-1.3/; I had some problems with my computer and haven’t been able to upload the most recent version yet, which only changes which functions are static and which aren’t.
The following classes have been implemented; some methods and functions have not been implemented due to dependencies on unimplemented classes, but the classes below are otherwise complete:
| From the Application Kit: | From the Interface Kit: | From the Storage Kit: | From the Support Kit: |
| Application Clipboard Cursor Handler Invoker Looper Message Messenger | Alert Box Button CheckBox ColorControl Control Font Menu ListItem ListView MenuBar MenuField MenuItem OutlineListView Picture PictureButton Point Polygon PopUpMenu RadioButton Rect Screen ScrollBar ScrollView SeparatorItem Shape Slider StatusBar StringItem StringView TabView TextControl TextView View Window | Entry EntryList FindDirectory* Mime MimeType Node NodeInfo NodeMonitor* Path Query Statable Volume VolumeRoster | Archivable Beep* Errors* TypeConstants* |
*These don’t actually contain any classes: Errors and TypeConstants expose constants; Beep exposes functions; FindDirectory and NodeMonitor expose both constants and functions.
Over the past weeks I have improved the contacts kit core so that it has enough support for the formats we handle. The vCard and People translators can now translate and exchange many types of fields, though photos and groups aren’t supported yet.
Main functionality of the classes:
BRawContact
This class deals with the BTranslatorRoster and keeps track of basic information, like the final format. The final destination is represented as a BPositionIO object. Basically, the class finds a suitable translator when initialized and provides two methods: Commit() and Read(). The first takes as argument a BMessage containing flattened BContactFields; the second translates the source file (which can be in B_PEOPLE_FORMAT, B_VCARD_FORMAT, or B_CONTACT_FORMAT) into a BMessage and returns it.
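To make the flow a bit more concrete, here is a minimal, hypothetical sketch of how client code might drive BRawContact. The constructor arguments, the exact Commit()/Read() signatures and the helper function name are my own assumptions based on the description above, not the final API.

```cpp
// Hypothetical usage sketch -- the BRawContact constructor and the exact
// Commit()/Read() signatures are assumptions based on the description above.
#include <File.h>
#include <Message.h>

#include "RawContact.h"    // assumed header for the contacts kit classes

status_t
ConvertPersonFileToVCard(const char* personPath, const char* vcardPath)
{
	// The source: an existing People file.
	BFile source(personPath, B_READ_ONLY);

	// Read() asks the BTranslatorRoster to translate the source
	// (People, vCard or the native contact format) into a BMessage
	// of flattened BContactFields.  An out-parameter is assumed here;
	// the description suggests Read() may return the BMessage instead.
	BRawContact reader(B_PEOPLE_FORMAT, &source);
	BMessage contact;
	status_t result = reader.Read(&contact);
	if (result != B_OK)
		return result;

	// The destination: any BPositionIO works; here it is a new vCard file.
	BFile destination(vcardPath, B_WRITE_ONLY | B_CREATE_FILE);

	// Commit() takes the BMessage of flattened BContactFields and writes
	// it out in the destination format chosen at construction time.
	BRawContact writer(B_VCARD_FORMAT, &destination);
	return writer.Commit(&contact);
}
```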
In these last few (official) weeks of Google Summer of Code I’m focusing on the meat of my project. This means that side features like the achievements, scoreboard, etc. will be ‘frozen’ as-is until after GSoC. I’m planning on rounding them out, just not yet. The main work will be on getting the builddrone working properly, plus testing and signing. A major bug was in the camlistore Python library; I’ve fixed it and will change how it works a little.
The following classes are now mostly implemented; there are some methods that cannot be implemented yet because they require objects that are not yet implemented, but otherwise these classes are complete.
| From the Application Kit: | From the Interface Kit: | From the Support Kit: |
Application Clipboard Cursor Handler Invoker Looper Message | Alert Box Button CheckBox Control Font Menu ListItem ListView MenuBar MenuField MenuItem OutlineListView Point PopUpMenu RadioButton Rect Screen ScrollBar ScrollView SeparatorItem StringItem StringView TabView TextControl TextView View Window | Archivable Errors* |
*Errors doesn’t actually have any classes; it merely exposes a lot of constants to the target language.
Briefly, my goals for the three-quarter term were: port libzfs, port the command-line tools zfs and zpool, and write a kernel module that communicates with the userland tools via ioctl() calls on /dev/zfs. Another goal was to make sure our port of ZFS passes all tests in ztest.
With the exception of a few missing routines, libzfs builds fine on Haiku, and so does zpool. zfs still requires some love, but nothing major remains to be done. In fact, apart from a few routines that I need to implement in libsolcompat (our Solaris compatibility library), the port builds almost perfectly on Haiku. But getting it to build is only half the battle ;)
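As a rough illustration of the userland/kernel boundary mentioned above, here is a minimal sketch of the ioctl() pattern the tools rely on. The zfs_cmd_t layout and the command number follow the OpenSolaris headers as I understand them, and the pool name is made up; treat this as a sketch of the interface our kernel module has to serve, not as code from the port itself.

```cpp
// Minimal sketch of the userland side of the /dev/zfs interface: the zfs and
// zpool tools do essentially this through libzfs.  Details are illustrative.
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>

#include <sys/zfs_ioctl.h>   // zfs_cmd_t and the ZFS_IOC_* command numbers

int
main()
{
	int fd = open("/dev/zfs", O_RDWR);
	if (fd < 0) {
		perror("open /dev/zfs");
		return 1;
	}

	// Every request travels as one zfs_cmd_t structure; the kernel module's
	// job is to dispatch on the command number and fill in the reply.
	zfs_cmd_t zc;
	memset(&zc, 0, sizeof(zc));
	strlcpy(zc.zc_name, "tank", sizeof(zc.zc_name));   // hypothetical dataset

	// ZFS_IOC_OBJSET_STATS asks for the basic stats of the named dataset.
	if (ioctl(fd, ZFS_IOC_OBJSET_STATS, &zc) != 0)
		perror("ZFS_IOC_OBJSET_STATS");

	close(fd);
	return 0;
}
```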
So far, SDL 1.3 for Haiku has made significant progress. Video draws correctly both with and without OpenGL, audio appears to work already, and various tests provided in the SDL test suite seem to pass. However, there are a few significant bugs I have come across.
The first error occurs when resizing the window: the application occasionally receives an illegal operation signal or a SEGFAULT. The illegal operation signals occurred when blitting from the backbuffer I allocated to the screen buffer provided through BDirectWindow’s DirectConnected() hook. Presumably this was caused by the window being resized in the middle of a draw operation, since the error only appeared after I moved blitting to a separate thread. Before this, blitting was done in the application thread, which slowed SDL’s event handling by up to a second (moving the mouse around required redrawing the window). I received several suggestions for fixing this, including moving the blit code back to the application thread, using mutexes, and using BBitmaps. I have since transferred drawing operations to a BBitmap object, which appears to have eliminated the illegal operation signal. However, continuously resizing the window still results in the occasional SEGFAULT. I only discovered this error today and intend to investigate it further over the weekend. So far I have noted that the SEGFAULT occurs in different places on different runs, though I have not yet found a connecting pattern.
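For anyone curious about the approach, here is a stripped-down sketch of the pattern I’m converging on: render into a BBitmap backbuffer, and let a blit thread copy it into the buffer that DirectConnected() hands out, with a lock guarding the connection state. The class and method names here are mine, clipping is omitted, and the real SDL back end is more involved; treat this as an illustration rather than the actual driver code.

```cpp
// Illustrative sketch only: a BDirectWindow that keeps a BBitmap backbuffer
// and copies it to the direct framebuffer from a separate thread, with a
// lock so a resize can't pull the buffer out from under the blit.
#include <Autolock.h>
#include <Bitmap.h>
#include <DirectWindow.h>
#include <Locker.h>
#include <string.h>

class SketchWindow : public BDirectWindow {
public:
	SketchWindow(BRect frame)
		:
		BDirectWindow(frame, "sketch", B_TITLED_WINDOW, 0),
		fBackBuffer(new BBitmap(frame.OffsetToCopy(0, 0), B_RGB32)),
		fConnected(false),
		fInfo()
	{
	}

	virtual void DirectConnected(direct_buffer_info* info)
	{
		// Called by the app_server whenever the direct buffer changes
		// (connect, disconnect, resize).  Take the lock so the blit
		// thread never works from a stale buffer description.
		BAutolock locker(fLock);
		switch (info->buffer_state & B_DIRECT_MODE_MASK) {
			case B_DIRECT_START:
			case B_DIRECT_MODIFY:
				fInfo = *info;
				fConnected = true;
				break;
			case B_DIRECT_STOP:
				fConnected = false;
				break;
		}
	}

	void Blit()
	{
		// Called periodically from the blit thread: copy the backbuffer
		// row by row into the frame buffer while holding the same lock.
		// Clipping against fInfo.clip_list is omitted for brevity.
		BAutolock locker(fLock);
		if (!fConnected)
			return;

		uint8* src = (uint8*)fBackBuffer->Bits();
		uint8* dst = (uint8*)fInfo.bits
			+ fInfo.window_bounds.top * fInfo.bytes_per_row
			+ fInfo.window_bounds.left * 4;
		int32 height = (int32)fBackBuffer->Bounds().Height() + 1;
		int32 rowBytes = ((int32)fBackBuffer->Bounds().Width() + 1) * 4;

		for (int32 y = 0; y < height; y++) {
			memcpy(dst, src, rowBytes);
			src += fBackBuffer->BytesPerRow();
			dst += fInfo.bytes_per_row;
		}
	}

private:
	BBitmap*			fBackBuffer;
	bool				fConnected;
	direct_buffer_info	fInfo;
	BLocker				fLock;
};
```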
New status report: a major feature dropped, bugs fixed, and some screen research done.
At the start of the third term, it was pointed out to me that Haiku does not actually support hardware 3D acceleration, and to add it would be a larger project than I have the time (or knowledge) for. Therefore, I’ve had to drop host-accelerated OpenGL from the planned features. I’m somewhat annoyed by this, but looking back it was probably a bit too ambitious anyway, and I’m not convinced I could have finished it in time.
Not so long ago, at the halfway mark of GSoC, I was optimistic that I was close to interpreting data from the camera in a way that would produce images on screen. I was successfully grabbing payload data from the camera, the camera’s in-use light was on, and things were looking good. Since that point, progress has been repeatedly stalled by strange and difficult-to-debug behaviour.