Teaching Open Source Planet
Teaching Open Source Planet is a Planet, a collection of personal blogs by Teaching Open Source community members working to bring the open source way into academia. We write about our inspirations and experiences in learning, teaching, and collaborating within free and open communities. In the spirit of freedom, we share and criticize in order to collectively improve. We hope you enjoy reading our thoughts; if you’re fascinated by what you see, consider adding your voice to the conversation.
In November OSS Watch travelled to Altleiningen near Mannheim in Germany to run a 2-day workshop for the TYPO3 community as part of their Marketing Sprint Week.
Across the 2 days, Scott Wilson and I presented sessions on the varieties of communities and why we form them, communication within online communities, governance of free and open source software projects, leadership, and conflict resolution.
While we were only able to have a small group from the vast community of TYPO3 at the workshop, those who did attend represented a range of teams from the community, including developers from both the TYPO3 CMS and TYPO3 Neos projects, as well as members of the marketing team and the community manager, Ben van ‘t Ende.
One of the great things to see was how open and honest the attendees were about the issues we discussed and the challenges they faced. A few points were of particular interest to me.
When we discussed the reasons we form communities, there was no clear agreement on what the shared interest of the TYPO3 community was. Defining this will be key to driving towards the community’s common goals in the future.
The community uses a myriad of communication channels, and the purpose of each isn’t always clear-cut. There’s also been a general lack of a moderation culture, which has led to a few poisonous people getting out of hand. Instilling a sense of shared values and leading by example will be needed over time to help ensure that discussions remain constructive.
There is a visible lack of diversity in the community, both shallow-level (most contributors are white, male and located in Germany) and deep-level (there are lots of highly skilled developers, but fewer who are learning or who come from other disciplines). These issues could affect the long-term sustainability of the community if the barriers for new, less-skilled contributors are too high. Engagement with users and less technical members of the community will also be key to shaping the community’s goals.
These problems aren’t the kind that will be fixed by a two-day workshop or a change of policy; it’s going to take commitment and leadership from those who believe in the community to move things forward. One thing that we definitely saw from this workshop is that those people are present and highly active in the TYPO3 community. We look forward to the possibility of working together again.
Our presentation slides from the workshop can be found on SlideShare.
A special thanks to Christian Händel for making sure we made it to Altleiningen and back!
I learned today that Nelson Mandela passed away, and this is surely a sad time for people around the globe because Mandela was a very passionate individual who had a very positive impact on so many people.
The Ubuntu Linux Project has its roots in a word he so eloquently explained in the video below and in fact Ubuntu at one point shipped with a copy of the video (Link for Planet Readers):
I think there is much for us all to learn from the life that Mandela led, and I hope that his spirit stays strong in the many people and projects he has inspired. Wouldn’t it be nice if Ubuntu 14.04 LTS shipped with a copy of the original video?
Aha! This is the best version so far of my criteria for Radically Transparent Research (website really needs rewriting; it’s currently a mess of writing from my first year of grad school), which is basically “a methodology for producing Free Scholarship.” I know, I know, there are tons of Open Research movements and projects out there; I’m trying to write an examination right now and will loop back and check in with them again after I pass it, ok?
- The work is public and freely accessible.
- The artifacts (data, analysis, etc.) used to create the work are also public and freely accessible so that they can be studied and peer-reviewed by communities of practitioners.
- The work and its artifacts can be freely modified and distributed so others in these communities can benefit from and build atop it.
“Chua’s 3 criteria for RTR” (or whatever less-silly name I can tack on that later) comes directly from a Free Culture + Academia mashup. From academic-land (specifically, the scholarship of teaching and learning), we have Lee Shulman’s 3 criteria for scholarship (paraphrased):
- It is public
- It is peer-reviewed by the practitioner’s community
- It can be used by that community as a stepping-stone towards future work
From Free Culture, Richard Stallman’s 4 criteria for software Freedom:
- The freedom to run the program, for any purpose.
- The freedom to study how the program works, and change it so it does your computing as you wish. Access to the source code is a precondition for this.
- The freedom to redistribute copies so you can help your neighbor.
- The freedom to distribute copies of your modified versions to others. By doing this you can give the whole community a chance to benefit from your changes. Access to the source code is a precondition for this.
See the connections between Shulman and Stallman? I don’t imagine this is the final or best statement ever (and look forward to seeing what future version comes up), but it sounds pretty good to me right now.
The concept of a software patch is well understood by users of version control systems such as CVS and Subversion, but with the recent rise in popularity of distributed version control systems (DVCSs) such as Git, a new method for submitting changes to a project has evolved: the pull request.
While not a feature of DVCS software, pull requests are a common workflow used by projects that manage their code with a DVCS, so knowing what they are and how to do them is important if you’re planning to contribute to such a project. You can find out more in our new briefing note, What Is A Pull Request?
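The local half of that workflow can be sketched from Python. This is a hedged illustration only: the repository path, branch name, file name and commit message below are all invented, `git` itself must be installed, and the final step of actually opening the pull request happens on the hosting platform, not locally.

```python
# Sketch of the local steps that precede a pull request, driven via
# subprocess; paths, branch and commit names here are hypothetical.
import os
import subprocess
import tempfile

def git(*args, cwd=None):
    """Run a git command and return its stdout."""
    return subprocess.run(("git",) + args, cwd=cwd, check=True,
                          capture_output=True, text=True).stdout

repo = tempfile.mkdtemp()                     # stand-in for your cloned fork
git("init", repo)
git("checkout", "-b", "fix-typo", cwd=repo)   # do the work on a topic branch
with open(os.path.join(repo, "README"), "w") as f:
    f.write("fixed\n")
git("add", "README", cwd=repo)
git("-c", "user.name=You", "-c", "user.email=you@example.com",
    "commit", "-m", "Fix typo in README", cwd=repo)

# After a `git push` to your fork, you would open the pull request in the
# hosting platform's web UI, asking upstream to pull your topic branch.
current_branch = git("branch", "--show-current", cwd=repo).strip()
```

Nothing here is specific to any one hosting service; the pull request itself is a convention layered on top of these ordinary git operations.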
Recently OSS Watch were invited to produce a video for an online course discussing how to get involved in open source software projects. In the video, we discuss why you’d want to contribute to open source projects, how to choose the right open source project for you, the communication channels used by open source communities, how to make your first contributions and what you can contribute other than code, as well as sharing a little of our own experiences contributing to open source projects.
Last month we officially kicked off a new EU project aimed at getting students involved in free and open source software projects.
Inspired by Google Summer of Code, the idea is to enable students to obtain academic recognition for their contributions to open software communities.
The project itself is called VALS; it will run the first pilot Semester of Code programme in 2014, using the same Melange platform that runs GSoC.
For this to work in practice we need to have an effective model for co-mentoring, with communication between the student, their academic supervisor, and mentors from the software community.
We also need to put in place assessment and curriculum changes at universities to support experiential learning by students in open development. For example, universities need to recognise and assess not just what students produce in terms of contributed artefacts (such as code and documentation) but also the processes of engagement and communication with the community, and developing their experiences in using issue tracking and source control systems.
Some of the common problems we’ve seen with other mentoring programmes have been a lack of commitment from students in actually completing their contribution, and breakdowns of communications between students and mentors, and these are some of the things we need to look at in how the programme is designed. For example, we need to make it clear how issues can be resolved, and also to ensure that all parties involved have appropriate expectations.
I think for open source communities it’s also critical that we look at how engagement by students can be sustained beyond a single contribution to become an ongoing part of their professional development, at university and beyond.
The project team includes OSS Watch, OpenDirective, RayCom, MindShock, and the universities of Salamanca, Udine, Bolton and Cyprus, though the programme itself is going to be open.
As the project has only just started we don’t have a website yet, but if your community or university is interested in participating in the pilot programme next year, let us know at email@example.com.
This year I’m dishing up my advice for holiday gifts, and each and every product suggested here is something I own or have had a decent amount of time to play with. I will try to highlight a few gifts that will be perfect for just about anyone.
Gifts for Travelers
1. Satechi Smart Travel Router + Charger
The Satechi Smart Travel Router + Charger is a really unique gadget: not only does it allow you to charge devices regardless of what continent you happen to be on, but it also serves as a wireless router. It is so small it will fit in your palm, which makes it perfect for travelers who are conscious of how much gear they pack.
I know this device is going to come in handy in my upcoming travels to Europe and I have tested it at home on standard American outlets.
2. OGIO Gambit 17″ Laptop Backpack
The OGIO Gambit 17″ Laptop Backpack is my go-to backpack when I travel, and I use it as a carry-on so I can keep my tablet, DSLR, MacBook and other gear secure and within arm’s reach. I have owned a number of high-end backpacks, but nothing has come close to OGIO’s Gambit Backpack.
The quality of the design and comfort under a full load, plus all the space and protection for my gear, make this the only backpack I’ll trust.
3. Anker Astro 3E 10000 mAh External Battery
I have been a big fan of Anker’s lineup of gadgets for some time, but my favorite product is their Astro 3E 10000 mAh External Battery. It offers me multiple recharges for my smartphone whenever I’m on the go and don’t have access to a power outlet. I keep this external battery on hand whenever I’m playing Ingress or flying somewhere.
4. ThinkTank Speed Racer V2.0 DSLR Bag
If you know me well, then you would know my DSLR Camera is almost always hanging over my shoulder in a ThinkTank Speed Racer V2.0 DSLR Bag.
This speed bag not only has cushioning to protect my Nikon from a drop but also offers up plentiful space for extra lenses, a cleaning brush and other tools of the photography trade. I’m impressed that ThinkTank went so far as to include a rain cover, which lets me go out in the elements and grab shots of the weather.
Gifts for Geeks
1. Quirky Pivot Power
I have been a big fan of the Quirky Pivot Power ever since Ben Kaufman, the CEO of Quirky, sent me one to check out. I had originally heard about Quirky and the Pivot Power from fellow geek Joe Tech, who was actually a co-inventor of the Pivot Power.
Anyways, this power strip is like no other because it really flexes to fit almost any use. I have one in my bedroom and one in my office, and I really enjoy the innovative design.
2. Classic Beard Head
Horse masks are so last year, but if you really want to impress your friends you will take it Nordic and sport a Classic Beard Head.
The folks at Beard Head sent me their Viking and classic beard heads earlier this year, and they have brought plenty of enjoyment over the months.
3. Fitbit Flex
These days fitness bands are coming out every couple of months, but Fitbit still reigns as king of fitness accessories. I own the Fitbit Flex, have had pretty great success with it, and enjoy challenging my friends to get fit together.
I have had the Fitbit Zip but really didn’t like losing it in the washer, so I switched to the Fitbit Flex, which tracks sleep and steps and has a built-in alarm. One thing I like most is that I can choose any color, and I actually have an Ubuntu Orange band.
4. KeyPort Slide 2.0
The KeyPort Slide 2.0 is the best gadget of the year in my book. It really disrupts something most folks don’t give much thought to: the keys to our homes and cars. The KeyPort Slide 2.0 lets you keep all of your keys in one organized, neat form factor.
One thing I like most of all is that the KeyPort Slide 2.0 supports a USB thumb-drive insert, which means I can keep all my important files in my pocket.
Just a few months ago the folks at Foldable.me sent me some fan mail: a miniature foldable.me, which is essentially a cut-out that you assemble into 3D form to become a miniature cartoon version of me.
Anyways, Foldable.me is a pretty affordable and entertaining gift that just about anyone will enjoy.
Well, that is all for now, but I will be updating this 2013 Holiday Gift Guide over the coming week or two with more excellent gift ideas and new categories.
Over the weekend I started playing Doom 3 BFG Edition, a re-release of the mid-00s first-person shooter. The reason I’m talking about it here is that, as we’ve discussed before, id Software, who make Doom 3, have a policy of open sourcing the code for their games.
Doom Disks by Pelle Wessman CC By-SA
Doom 3 and the BFG Edition are no different in this regard, both being open sourced and the original Doom 3 even receiving an official Linux port. However, id never ported the BFG Edition to Linux.
Predictably, this hasn’t posed a problem for the open source community, and a bit of googling turned up a GitHub fork of id’s code with support for Linux. The code relies on the SDL and OpenAL libraries to handle input and audio respectively, but once those dependencies were installed, it compiled for me on Ubuntu 13.10 and 12.04 LTS with no problems.
Alongside the resulting binary, actually playing the game requires the commercial data files, which aren’t distributed freely. Since the game is distributed using the Steam DRM system, you need to install a copy of the Windows Steam client, install the game, then copy the files into place.
It’s possible to install the game files using WINE, but I was using a laptop which happened to be dual-booting Windows so I installed the game as normal on Windows, then switched to Linux and created a symbolic link to the data files on the Windows disk partition.
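That symlink step can be sketched in a few lines of Python. Note that both paths below are hypothetical stand-ins for illustration, not the actual Steam or Doom 3 install locations on any particular machine:

```python
# Hedged sketch: link the directory where the Linux binary expects its
# data files to the game files on a mounted Windows partition.
import os

# Game data on the mounted Windows partition (made-up path)
src = "/mnt/windows/SteamApps/common/DOOM 3 BFG Edition/base"
# Where the Linux build looks for its data files (made-up path)
dst = os.path.expanduser("~/doom3bfg/base")

os.makedirs(os.path.dirname(dst), exist_ok=True)
if not os.path.islink(dst):
    # The link may dangle until the Windows partition is mounted
    os.symlink(src, dst)
```

The same effect can of course be had with a single `ln -s` at the shell; the point is only that the game never needs its data copied, just found.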
There are a few caveats to note about the open-source version of this game. Firstly, trying to run it with AMD graphics caused it to crash with OpenGL errors. Reading bug reports suggests this may be a driver compatibility problem (some people have gotten it working), but a system with NVIDIA graphics worked flawlessly. The game also uses a couple of non-free components which can’t be included in the GPL code: the Bink video codec and the “Carmack’s Reverse” shadow stencilling technique licensed from Creative. This means the odd in-game video is missing, although this doesn’t really detract from the gameplay as the audio still plays.
The ease with which I was able to find a solution to play this unsupported Windows game natively on Linux is a real testament to the open source community’s ability and willingness to solve and share solutions to problems. I was also really impressed by how well the game ran under these circumstances, showing how bright a future Linux has as a gaming platform.
Inspecting Oregon Sphagnum Moss (Eurhynchium oreganum)
Today I had an opportunity to make a trip down to Tryon Creek State Park in Southwest Portland. I was hoping to get some photos of mushrooms growing wild, but most of what I found was not in a condition worth photographing.
I did, however, stumble across lots of Sphagnum moss, specifically Oregon Sphagnum moss, one of 120 species of the moss. Sphagnum moss mostly occurs in the Northern Hemisphere, while Oregon Sphagnum moss occurs only in Oregon and Washington. There is actually an entire industry around the production of moss, which is used in a variety of consumer (compost, soil mixes, art, etc.) and commercial products.
I really did not know much about moss until a few months ago, when I decided to read a couple of books on moss, lichen and algae. I’m trying to become knowledgeable about the organic matter growing on trees since I started volunteering for a local non-profit that plants trees.
Anyways, it is pretty amazing how much there is to moss: how it grows and stores water, how many colors and species exist, how it impacts trees, and how it can help you determine things like which side of a tree sunlight reaches.
If you are out in the woods, I encourage you to have a closer look at how many species of moss, lichen and algae you can find in a single forest. Be careful, though, when handling these species with a naked hand: they can give you a rare fungal infection called sporotrichosis if you have a scratch or cut on your hand, and it is possible to be infected via inhalation.
You can check out more of my photos from Tryon Creek State Park right here; they are available under a Creative Commons license.
I've packaged the quick2wire python3 library for the Raspberry Pi. This provides easy access to the i2c peripheral bus from Python3; I've packaged this up because I need it, and also to test and demo the package review process for Pidora.
Here's a little demo of the quick2wire library in action which I wrote some time back and have been using as a test for the package -- this reads a TCN75A (data sheet) thermal sensor chip:
# test_tcn75a :: Test of reading a TCN75A digital
# temperature sensor using I2C
# - TCN75A is powered at 3v3
# - I2C lines connected to Raspi GPIO
# - Pins 5/6/7 are grounded (address = 72)
# - quick2wire Python library
# CTyler 2012-10-03 - GPLv2+

# Using the quick2wire module for I2C access
import quick2wire.i2c as i2c

# Using the time module for sleeping
import time

# Address (unit number) of the TCN75A temperature
# sensor on the I2C bus
address = 72

# Register number within the TCN75A that contains
# the current temperature
temp_register = 0

# Register number within the TCN75A that contains
# the configuration register
conf_register = 1

with i2c.I2CMaster() as bus:

    # Configure the resolution (optional step)
    # The configuration register is used to set the temperature
    # resolution. The higher the resolution, the more
    # accurate the temperature reading, but the lower the
    # sampling rate. Possible values are 0, 32, 64, and 96.
    # Value 96 = 0.0625C steps (highest resolution)
    # Value 0 = 0.5C steps (lowest resolution) (default)
    bus.transaction(
        i2c.writing_bytes(address, conf_register, 96))

    # Loop 100 times
    for i in range(100):

        # Select the address (unit on the bus) and desired
        # register, and read 2 bytes
        read_temp = bus.transaction(
            i2c.writing_bytes(address, temp_register),
            i2c.reading(address, 2))[0]

        # The first byte contains the temperature in degrees
        # Celsius (actually, this is a signed number, so
        # values over 127 are negative, but I'm ignoring
        # that here). The second byte contains 256ths of a
        # degree, but the default resolution of the sensor
        # is 0.5 degrees, so it will always be 0 (.0) or 128 (0.5)
        # unless the resolution is changed.
        # These lines convert the two bytes into a single
        # temperature and print it.
        print("Temperature: %3.3f°C" % (read_temp[0] + read_temp[1] / 256))

        # Delay half a second before getting next reading
        time.sleep(0.5)
The package is up for review in Pidora (not Fedora, but only because it's not useful on other platforms -- at least at this time). The package review, including links to the specfile and SRPM, is ticket #495.
When Big Data goes bad: 6 epic fails
by Donald Clark
Big data is the current buzzword in the IT world as well as in education. According to Donald Clark, "Data, in the wrong hands, whether malicious, manipulative or naïve can be downright dangerous. Indeed, when big data goes bad it can be lethal. Unfortunately the learning game is no stranger to both the abuse of data." In the above blog posting, he lists 6 examples in which Big Data fails.
1. Data subtraction: Ken Robinson
2. Data addition: Bogus learning theory
3. Claims beyond the data – University League Tables
4. Skewed data - PISA (Programme for International Student Assessment)
5. Faked data
6. Dirty data deeds
Read some reasons for the failure of Udacity, one of the MOOC initiatives that was meant to transform higher education.
Australis has finally landed in Firefox Nightly, and this presents an excellent opportunity for all Firefox users to help test Nightly. Mozilla relies pretty heavily on user feedback and its Nightly testers to help keep each release as crisp as possible.
Installing Nightly on Ubuntu
Reporting Bugs in Firefox
Here is a very simple but robust guide to reporting bugs in Firefox
Feedback is really important to Mozilla; in fact, Mozilla has an entire team that focuses a lot of their attention on reading through feedback and advocating for changes in Firefox to meet the expectations and needs of users. You can submit feedback about Firefox by going to Help -> Submit Feedback or by clicking here.
This morning, like Oliver Grawert, I learned that a variety of Linux-related news sites had decided to cover a discussion from a few weeks ago on the Ubuntu Devel mailing list surrounding the possibility of an Ubuntu MATE Remix.
The coverage in the media has been pretty mixed, but I was disappointed to see some sites trying to put a spin on things and suggesting that this was Canonical being critical of Linux Mint, which is not the case. In fact, Clement Lefebvre has for the most part confirmed that what Oli and I said was accurate: certain updates in Linux Mint are not enabled by default, and Firefox on LMDE (the Debian Edition of Linux Mint) was not always updated as promptly as it probably should have been, though it is now updated automatically.
I wanted to post briefly and just say that my personal opinion is that Kernel and Xorg updates are important for users to have, and Ubuntu Developers are diligent in testing updates to packages and addressing regressions.
I do not think stability and performance are as big an issue as Clem suggests, and I think it is important for users to receive security updates for all packages automatically, without having to do anything extra to get them.
Anyways, this post is not meant to convince anyone of my opinion, and neither was the original discussion on the mailing list.
Let’s get back to making awesome free software for our users! FOSS Yeaaaah!
I’ve been spending too much time lately thinking about file systems. What started it was a casual conversation I had with Alan about work he was doing to implement a node.js-style POSIX file system in JS on top of IndexedDB. At the same time I was writing about an idea I had to create a new Webmaker tool called Nimble. The basic idea behind Nimble is to integrate Adobe’s Brackets editor with Webmaker and replace Thimble.
I knew that one of the things holding Brackets back from working in the browser was the file system, so I decided to try hacking IDBFS into Brackets, and get it running in both WebKit and Gecko. I’ll skip to the end of the story and show you a demo of it working in Chrome and Firefox. You can see the contents of my db in Chrome’s resource inspector:
The timing of this work was perfect because Adobe literally landed code to change their File System API the same week. Previously Brackets assumed it was talking to the OS file system via the CEF Shell. With the new API, I was able to swap in my own implementation, and use IDBFS to handle all the various operations. Alan and I worked to add the missing methods that Brackets needed, and now it “Just Works.”
My browser-fs branch of Brackets has more details, but the majority of the code related to the file system is in src/filesystem/impls/browser/BrowserFileSystem.js. There’s still more work to do, for example, adding UI for the File Open and Save As dialogs in HTML, but this is going to work. Adobe also have another branch underway to refactor other parts of the code that assume a native appshell vs. a browser, so I’m hopeful about getting something like Nimble to work.
Back to IDBFS… doing the patches in Alan’s code to add the operations I needed gave me time to think. Now that I had a node.js-like file system, what else could I do? The amount of fs code already written for node.js means we’re going to have a lot of libraries and other tools we can port to the browser pretty quickly. Another project I’ve been following with interest is js-git. The way Tim has designed its API, you can easily swap in different backends for git storage. So I’ve started on one for IDBFS + js-git.
Also, while I was doing the Brackets work I noticed code Adobe has started for Dropbox support. It made me think about the kinds of things that having a file system in your browser would let you do. Imagine going to a web page and being able to clone a repo, edit your work, push up changes, share/sync things with other cloud drives, etc., all without installing anything. Imagine if Mozilla’s sync stuff could be taught to sync IndexedDB databases you have (maybe it already can, I don’t know), and now you can do rsync between browsers/devices.
One of the things I am hoping to do in the coming weeks is add other storage backends to IDBFS. IndexedDB is cool, but limited in where it’s supported. If we add support for WebSQL and LocalStorage we hit many more browsers (obviously at a cost of storage space, but still…), we can s/IDBFS/WebFS/ and make this generally useful cross-browser.
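The backend-swapping idea behind js-git and a generalised WebFS can be sketched in a few lines of Python. The class and method names below are invented for illustration; this is not the actual IDBFS or js-git API, just a facade whose front-end calls stay the same whichever storage backend is plugged in:

```python
class DictBackend:
    """Keeps file contents in an in-memory dict (a stand-in for a real
    store such as IndexedDB, WebSQL, or LocalStorage)."""
    def __init__(self):
        self._blobs = {}

    def get(self, key):
        return self._blobs[key]

    def put(self, key, value):
        self._blobs[key] = value

class FileSystem:
    """Front-end file API; its calls never change, no matter which
    backend object is plugged in at construction time."""
    def __init__(self, backend):
        self.backend = backend

    def write_file(self, path, data):
        self.backend.put(path, data)

    def read_file(self, path):
        return self.backend.get(path)

# Swapping backends is just a matter of passing a different object in.
fs = FileSystem(DictBackend())
fs.write_file("/docs/readme.txt", "hello")
content = fs.read_file("/docs/readme.txt")  # "hello"
```

A WebSQL or LocalStorage backend would just be another class with the same `get`/`put` surface; nothing in the front-end would need to change.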
If you’re interested in file systems and want to help with this effort, please get in touch. I’m hoping to have Alan speak to my class next week so we can make some plans about how to push on this work.
Mozilla has implemented and shipped the Web Audio API recently, and at the same time begun the work to remove the Audio Data API. I wanted to say something on the occasion of this transition, and what I learned about the web in making the Audio Data API.
In 2009 Seneca’s Centre for Development of Open Technology applied for and received one of the largest research grants available to Canadian educational institutions. Our grant (which I’m still working on today) was aimed at supporting open source development with various industry partners, one of which is Mozilla.
Some day I need to write about all the things Seneca and Mozilla have been able to do together in that time, but one of the crazy ideas we had was to “create something like canvas for audio scripting.” It was the sort of thing that no one at Mozilla had the cycles to do; if you were working on or with Mozilla at this time, you’ll remember that shipping Firefox 4 was a somewhat busy period. I remember writing about it in the grant proposal and literally having no idea how we’d even begin. My thought was that we’d tackle this late in the grant.
First, the Audio Data API was created and built by volunteers. That’s one of the most important aspects of what we did, and why I still work on Mozilla. We often talk about the web as a democratizing technology and as an open platform for everyone. However, it’s not just that wikis are open, or that you can make your own web page. The web platform itself is open and malleable, and Mozilla is willing to enable a community of participation all the way down the stack. We didn’t make a web page; we remade the browser, and made a whole new kind of web page possible. There aren’t many other platforms where that’s even possible. The web is a different sort of platform, and Mozilla a different sort of gatekeeper.
Second, I learned that changing the web is really hard, and that without encouragement and support, you’ll fail. I can honestly say that if people like Chris Blizzard, Vlad Vukićević, Benoit Jacob, Olli Pettay, and many others hadn’t been so relentless in their cheer-leading, and gone out of their way to help and encourage us, I’d have given up. I learned a ton about how to mentor through that period by being mentored myself.
Fourth, winning doesn’t just mean you get your way. Today I’m writing a post about code I slaved over for close to a year being deleted from Mozilla, and yet I feel very much like we won. I always remember Shaver talking about Microsoft starting work on IE again as being a “win” for Mozilla, and wondering why he felt that way. Now I get it. The work we did accomplished a number of things. First, it changed people’s minds about what the web was capable of doing: the web is fast enough, and we proved it. Second, it kickstarted the standardization of audio programming on the web; see Doug’s post here. Third, as Martin Best told me at MozFest a few weeks ago, it gave Mozilla a way to do early work on gaming, when there was no way to do programmatic audio in a browser at all.
I wanted to write this so I could take a minute to thank those that worked on this with me. I also wanted to bring attention to the fact that you can influence the web. I see too many web developers complain and not file bugs, or have ideas for how to evolve the web and not have the courage to try hacking their browser. There is this idea among some that standards need to happen before you can do any work. I totally disagree. You standardize what you know is working, and you only get there by doing experiments. What the web will do in 5 years needs to start today as some hack by a bunch of friends who have an idea. Don’t be afraid to experiment with the web.
I’m bummed out that I will be unable to attend vUDS, simply because it’s difficult to justify taking time off from my projects for a virtual event, but I wanted to quickly point out some sessions folks should check out:
Ubuntu Documentation Roundtable – The Ubuntu Documentation Team is holding a session to discuss important issues, workflow and other topics for the 14.04 cycle. As always, the team is looking for new contributors to help keep documentation crisp for each cycle.
SDK for HTML5 App Developers – I think it’s really important for folks to start diving into HTML5, whether you’re doing it to make an app for Ubuntu Touch or Firefox OS. Either way, porting across platforms will be easy once you dive in. This should be an interesting session, and it will be nice to see more HTML5 apps for Ubuntu Touch.
Ceph activities for trusty – Ceph is an awesome distributed object store and file system built for scalability, performance and high reliability. The Ceph project is a great community all its own, and it’s great to see solid support for Ceph in Ubuntu Server.
Limiting Surveillance in Ubuntu – Ubuntu was recently given a Big Brother award, an anti-privacy award, but even before that, reputable advocacy organizations such as the Electronic Frontier Foundation, Privacy International and the Free Software Foundation were critical of the Unity Scopes feature, and still are today. I think it is important for Ubuntu members and users to continue to advocate for more user choice and privacy in Ubuntu.
Online learning, or studying online, can be a lonely affair.
However, there are many ways in which the online learner can connect with his or her course mates and tutors.
Libby Page has written an article that gives some tips for distance learning students. Here are some of her top tips:
Top tips for distance learning students
• Make the most of online forums and social networks to meet other students and ask for help.
• Set yourself deadlines to help keep yourself motivated.
• Become a student rep. Student reps work on behalf of their peers, and most universities have distance learning representatives. Being a rep is a great way to engage with your peers and to see issues you may have with your course addressed.
• Even if you are short on time and not necessarily looking to make lots of new friends, making connections on your course can be a good networking opportunity, particularly if you are studying a business course.
The more stellar Ubuntu contributors I see say goodbye to Ubuntu, the more I think there is a growing need for the Ubuntu community to have a foundation, much like WordPress, Python, and any number of other open source projects.
Right now Ubuntu is driven by decision makers at Canonical, with input from the community carrying very little weight and often none at all. This was not always the case; the culture has definitely changed from meritocracy and strong community values to a purely product focus.
Why a Foundation is Important
Simply put, with a foundation in place, things like this do not happen. Better yet, a foundation would allow the community to be a real stakeholder in decisions about Ubuntu’s direction and, more importantly, would allow the community to fundraise for the foundation, which could bring back a physical Ubuntu Developer Summit.
Matt Mullenweg on transferring the WordPress trademark to the newly formed WordPress Foundation:
“This means that the most central piece of WordPress’s identity, its name, is now fully independent from any company.”
I believe that without an Ubuntu Foundation, or the community becoming real stakeholders, Ubuntu will each year become less and less about community or Linux for human beings, and more and more about trying to make a profit or meet the expectations of the Windows and Mac users that Canonical is trying so badly to woo.
I read yesterday that the Internet Archive located in San Francisco suffered a fire which damaged equipment used to digitize physical materials. The Internet Archive plays an important role in digitally preserving a vast collection of useful information that users of the internet access daily.
I encourage folks to make a donation to the Internet Archive so they can get the funds needed to be back in normal operation.
P.S. Check out this Internet Archive record of Ubuntu.com from 2006 back when the old logo was used, and the page described Ubuntu as Linux for Human Beings. The vision has clearly changed some.
Social Media in Education
by Dr. Kumuda Gururao
You can watch a video of Dr. Kumuda Gururao's presentation at this link:
The Learning Objectives
- What is meant by social media?
- Tools of social media for education
- Benefits of social media
- How to implement it for your institution
- How to curate the content from various sources
- Real life examples
MOOCs - the flipped University?
Donald Clark listed 10 reasons why MOOCs have become the flipped university:
1. Flip from supply to demand
2. Flip from offline to online
3. Flip from horizontal to vertical
4. Flips teaching to learning
5. Flips assessment
6. Flip the standards
7. Flips drop-out to drop-in
8. Flips criticism
9. Flip from inward to outward
10. Flipped University
There is an interesting report, "IT plays key role in higher education of the future", at this website:
This event was held in the National University of Singapore on 30 Oct 2013. It is the Fourth China-India-Singapore Dialogue on Higher Education with the theme “Higher Education in the 21st Century: The Role of Technology”.
Get to know what China and India are doing in the e-learning areas.
As things stand now, Ubuntu has very limited user advocacy occurring, and nobody on the community or Canonical side to do any User Advocacy. Many open source projects equal in size to Ubuntu have dedicated teams focusing on User Advocacy, and I believe this is one area where Ubuntu can do better.
We constantly see all kinds of new features landing in Unity and Ubuntu, but are they what the users want or need? Well, we really don’t know, since there is not any push for User Advocacy.
I propose we start the discussion of having a User Advocacy Team in Ubuntu and make User Advocacy a priority on the Desktop. This team can be comprised of folks from various teams and governance, and we can all come together to give the experience our users want and need.
The goals of the Ubuntu User Advocacy effort would be:
- to advocate for better understanding and consideration of Ubuntu user needs;
- to develop features on the Desktop that allow for direct feedback from users;
- to better perform research on user practices on the desktop in order to enhance the desktop experience.
Right now Ubuntu has no feature in System Settings or any Menu that allows users to easily submit feedback on their Desktop Experience and give feedback on features. This is something we should be measuring and analyzing as a community not just in Ubuntu but ideally in the flavors too.
I’m interested to hear what others think about Ubuntu being improved and more feedback driven. I’m interested to hear from folks at Canonical to see what their thoughts are on a robust User Advocacy initiative.
Take our poll: http://polldaddy.com/poll/7535113
There are a number of centers using WeBWorK that are interested in ensuring that our online homework system is accessible, that it meets universal design goals, and that it can be used as widely as possible. Using MathJax to represent mathematical equations has been a big boost in that direction.
Here is a post from Portland Community College, one of the active centers working on the accessibility aspects of WeBWorK. The summer MathFest conference will be held in Portland, OR on August 7-10, 2013, and there are tentative plans to hold a code camp in Portland devoted to WeBWorK accessibility on the three days preceding MathFest. Details are still being worked out, but if you are interested or have ideas or suggestions, email me or Alex Jordan at Portland Community College.
Below is the report.
Report from Portland Community College
Kaela Parks: Director of Disability Services
Karen Sorensen: Accessibility Advocate for online courses
Chris Hughes, Scot Leavitt, Alex Jordan: Math faculty
Making Math More Accessible at Portland Community College
At Portland Community College (PCC), Disability Services (DS) is tasked with ensuring the accommodation process unfolds appropriately across and throughout a district serving approximately 90,000 students per year, 50,000 of whom are seeking credit. In recent years the options for curricular content format and delivery have changed considerably, bringing new barriers, but also new opportunities for making math more accessible. Many courses are now designed around online engagement points that tap vast databases, generating individualized browsing sessions any time of day or night, giving users valuable and almost instantaneous feedback. While Disability Services can convert known problem sets ahead of time, and hire aides to serve as readers and scribes, it is not practical, nor does it provide equally effective communication, to try and address barriers on the fly.
The truth is that while there will always be some need for individualized accommodation, for example creating tactile representations of graphs, there is much that can and should be done on the front end to minimize the need for manual adjustments. If online engagement points comply with established Web Content Accessibility Guidelines and use proper structural markup for math content, users who rely on text to speech, braille translation, magnification, or voice recognition, can still typically get what they need. The content is built for flexibility. However, when these best practices are not honored, there is often no way DS professionals can ensure equally effective communication. We can’t reach behind the firewall and “fix” content by adding descriptions to images, putting equations into MathML, or redesigning the interface to ensure keyboard navigation. What DS can do, and should do, is partner with Faculty, Instructional Support, and other stakeholders to help the institution recognize our shared responsibility to ensure equal access through ethical course design and curricular material adoption processes.
At PCC, online instructors develop their own courses within the learning management system. They choose the color and formatting of their text, and the media, publisher materials, and third-party web sites and applications to use in their courses. And since the spring of 2012, all new and revised online courses whose development is paid for by the Distance Education (DE) department are reviewed for accessibility. But how is an instructor supposed to know what’s accessible and what isn’t?
The Distance Education department has seen accessibility as an area where instructors need support. To that end, they hired an Accessibility Advocate who trains instructors and reviews courses for accessibility. And last fall (2012), the DE department, together with the Math department, co-sponsored two math faculty in their study of accessible mathematics. This subject-specific study was so successful that the DE department hopes to emulate it with other academic program areas, especially in the STEM fields.
Math faculty members Scot Leavitt and Chris Hughes investigated both the accessibility of content generated by the instructor and that which is delivered by homework management systems. In addition to studying commercial homework management systems such as MyMathLab, they ran a battery of accessibility tests (assisted by Math faculty Alex Jordan) on WeBWorK. The results from the WeBWorK experiments were superb: the screen reader JAWS was able to navigate easily around the web page and, most importantly, could read even the most complicated mathematical expressions with the greatest of clarity.
WeBWorK is currently the only math homework management system fully endorsed by the Disability Services Office at PCC, and they are providing strong support in the creation of a dedicated server to host it. The server should be fully functional by the end of Summer 2013, and ready for widespread use across the college at some point within a year. Supporting WeBWorK in this way allows PCC to provide instructors with an alternative to commercial offerings that have known accessibility issues. By establishing our own WeBWorK server, we ensure our community has access to a powerful homework management system that is more usable to more people more of the time. It also provides the institution with a means to ensure access for students who are enrolled in sections built around inaccessible engagement points, by providing an equally effective alternative.
Some of you will remember the post My Year of Open Source from 1 January 2011 – almost 3 years ago – where I made a New Year’s resolution to participate more in FOSS. Here are the goals I listed for myself for that year:
I have four main goals (at this point):
- Learn the tools and processes myself by participating in a FOSS project.
- Figure out what FOSS tools and processes I can begin to introduce my students to in earlier courses.
- Figure out what FOSS experience(s) I can provide my non-CS students.
- Find a project (or projects) to place my Senior CS students into in Spring 2012.
Well, it was as successful as most New Year’s resolutions – meaning, not very. Or maybe, not completely. I was (partially) successful at some of those goals, although almost none were completed within the year that I so rashly promised.
Figure out what FOSS tools and processes I can begin to introduce my students to in earlier courses.
This one was somewhat successful, although not until this past June (2013) when I managed to have my summer Introduction to Programming class (all six students!) use git and Bitbucket to collaborate with their lab partners and to submit their work to me for grading. Fresh from that (small-scale) success, I tried to have my Programming for Non-CS Majors class do the same, and ran into some scaling issues. We’re working on the solution for that right now – more in a future post.
My Spring 2013 capstone project course did use git and GitHub for our project developing an app for a Worcester Art Museum exhibit. But my understanding of git was not as good as it could have been, and the students' use of git was spotty. We also planned to use Pivotal Tracker, but didn’t get very far. We did successfully use IRC, however.
Find a project (or projects) to place my Senior CS students into in Spring 2012.
My Spring 2012 capstone project course worked with Eucalyptus, and had some pretty strong interaction with some of the members of the community, but I think that both the students and I felt we weren’t as successful as we could have been due to some technical issues early on in the course. For Spring 2013, I abandoned working in an existing FOSS project in favor of new development when the Worcester Art Museum opportunity presented itself. We did, however, make our code freely available (https://github.com/CS-Worcester/JILOA)
Figure out what FOSS experience(s) I can provide my non-CS students.
This goal got very little attention, other than my abortive attempt at using git in the Programming for Non-CS Majors course.
Learn the tools and processes myself by participating in a FOSS project.
And I still have not made any real progress in my own participation in a FOSS project.
However, that’s all going to change. Stay tuned for My Year of Open Source v2…
Our CS 401 Software Development class was canceled on Monday, 11 February 2013 due to ongoing snow removal and cleanup on campus from the Nemo blizzard. (Worcester received 28.5 inches of snow in just about 24 hours.) This is a problem for a class that meets only on Mondays, especially with the next Monday being a holiday.
As soon as the campus closing was announced on Sunday afternoon, I emailed the students to announce that we would replace class the next day with an IRC (Internet Relay Chat) meeting. (Actually, that’s a lie. The first thing I did was panic, then I screamed, then I ranted to my family about the injustice of cancelling my Monday-only class. Then I thought about holding class on IRC…) Here is the message I sent the students on our class listserv:
Campus is closed tomorrow, so we will not have class. We will not have class next week either due to the President’s Day holiday.
This is going to seriously mess up our schedule. I’ll think about how we can carry on in the two weeks.
Let’s try to hold an IRC chat tomorrow during class time (2:00pm-4:30pm). I’ll send out instructions on installing (optional) and using an IRC client later tonight. I have instructions already written up, I just have to find them, possibly update them, and send them out.
Holding class on IRC would be a little bit of a challenge since the students had not used IRC yet, so this would have to serve as both an IRC familiarization exercise and a useful meeting. I sent them the following message to prepare them:
We are going to meet today on IRC (Internet Relay Chat) at 2:00pm.
You should read through this in advance so that you are prepared. Especially if you are going to install an IRC client – you will need time to set it up. I suggest trying this out at least 1/2 hour in advance to make sure you get the connection working. I’ll stay on IRC all day so you can try out chatting.
You have two choices for connecting to the IRC server:
- Install an IRC client. There are many available, you may want to try a few to see which you like the best. Some are standalone applications, and some are browser plugins (like Chatzilla for Firefox.) I’ve heard that mIRC is the most popular for Windows, I use Colloquy on the Mac.
Here are some of the most important settings you will need. How you set these will depend on your client. You will want to install your client and do the setup in advance of our meeting, so you aren’t late.
- Server: irc.freenode.net
- If you can set a port, you may want to use 7000 since it can be used for an SSL connection.
- Nickname: Choose your own*
- Channel: ##WSU-CS401
- Use the webchat page on freenode: https://webchat.freenode.net
- Nickname: Choose your own*
- Channels: ##WSU-CS401
- Complete the reCAPTCHA
* You may want to register your nickname, so that no one else can use it. That way we can all get used to looking for a specific nickname for you. See the instructions: http://freenode.net/faq.shtml#registering
The most important commands while chatting:
- /SERVER new-server-hostname
- /NICK new-nickname
- /JOIN #channelname
- /ME does something
This command is used for saying that you are doing something like:
/ME is looking for that information in my email
- If you want to address your comments to everybody, just type your comment and hit return.
- If you want to address your comments to a specific person, type their nickname followed by a colon, then your message. E.g.
kwurst: I have the answer to your question
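For the curious, the slash-commands above map directly onto raw lines of the IRC protocol that the client sends to the server. The sketch below is my own illustration of that mapping, not part of the class instructions:

```python
def irc_command(command, *params):
    """Format a client command as a raw IRC protocol line.

    If the final parameter contains spaces, it is prefixed with ':'
    as the IRC protocol requires (e.g. for message text).
    """
    parts = [command.upper(), *params[:-1]]
    if params:
        last = params[-1]
        parts.append(":" + last if " " in last else last)
    return " ".join(parts) + "\r\n"

# The /NICK and /JOIN commands from the list above become:
assert irc_command("nick", "kwurst") == "NICK kwurst\r\n"
assert irc_command("join", "##WSU-CS401") == "JOIN ##WSU-CS401\r\n"

# /ME is sent as a CTCP ACTION wrapped in a PRIVMSG:
action = irc_command("privmsg", "##WSU-CS401",
                     "\x01ACTION is looking for that information\x01")
assert action.startswith("PRIVMSG ##WSU-CS401 :\x01ACTION")
```

Your IRC client handles all of this for you; the point is just that each slash-command is ordinary text on the wire.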
I had created a course-specific channel on freenode last spring, so we could use that channel, but to hold a useful meeting, I felt it would be vital to have a MeetBot running to take minutes. I could have used the #teachingopensource channel, which has zodbot installed, but then the minutes would be saved on Fedora’s website rather than ours. So I decided to install Supybot with the MeetBot plugin on our own server here.
I managed to get MeetBot installed (mostly – it gives me an error message for every meeting command I issue, but then does it anyway), and we had a very successful meeting for a class of IRC newbies: http://cs.worcester.edu/kwurst/wsu-cs401/2013/wsu-cs401.2013-02-11-21.13.html
BitTorrent, creators of the highly popular distributed peer-to-peer file sharing protocol, recently released BitTorrent Sync, a solution for syncing folders between machines based on the BitTorrent protocol. BTSync provides a fully distributed and encrypted alternative to services like Dropbox where all your data is synced through a third-party server.
BTSync has been released for Windows, Mac, Linux and other platforms, although the user experience on Linux isn’t quite as polished as its counterparts – the only interface provided is via a local webserver accessed through your browser, while Windows and Mac get a nice desktop GUI with a system tray indicator. I found this a pain, as I’d sometimes finish making changes to a synced file and want to shut my computer down quickly, but had to open my browser first to check whether the file had finished syncing.
While BTSync isn’t Open Source, the developers are very open to feedback from users and developers. I quickly realised that I’d be able to use data from the web interface to create a desktop indicator for Linux, so in the open source tradition of scratching my own itch, I wrote a Python script that gave me an indicator to show whether a file was syncing. When it was workable, I stuck it on GitHub with an open source licence and made a post on the BitTorrent Labs forum.
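The core idea behind an indicator like this is simple: poll the local web interface and decide whether anything is still transferring. Here is a rough sketch of that idea – the endpoint URL and the JSON shape are hypothetical stand-ins, not BTSync's actual API:

```python
import json
import urllib.request

# Hypothetical local endpoint; the real web UI's URLs differ.
STATUS_URL = "http://localhost:8888/api/status"

def is_syncing(status_json):
    """Return True if any folder still has bytes left to transfer.

    Assumes a (hypothetical) JSON payload like:
    {"folders": [{"name": "docs", "bytes_remaining": 1024}, ...]}
    """
    folders = json.loads(status_json).get("folders", [])
    return any(f.get("bytes_remaining", 0) > 0 for f in folders)

def poll_once():
    """Fetch status from the local web UI and report the sync state."""
    with urllib.request.urlopen(STATUS_URL) as resp:
        return "syncing" if is_syncing(resp.read()) else "idle"
```

A real indicator would call something like `poll_once()` on a timer and update a tray icon accordingly.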
I then noticed another post on the forum by a developer called Leo Moll – he was packaging BitTorrent Sync for Ubuntu and Debian distributions, and as I’d written my script with Ubuntu in mind, I asked if he’d like to include it in his packages. He agreed, and before long my indicator could be installed alongside a well integrated BitTorrent Sync client.
Here’s when things really took off. With it being so easy to get hold of my indicator, people started using it and reporting bugs on the GitHub page. Almost as quickly, they started submitting patches. I got a new set of better animated icons for the indicator, various bugfixes for cases I’d not come across, new feature requests, and even someone packaging the indicator for Arch Linux.
Alongside this Leo and I were contacted by another developer who was packaging BitTorrent Sync for Debian and Ubuntu. We had a discussion and worked out where best to focus our efforts to avoid duplicating each other’s work and creating conflicting packages. Leo and I are now discussing merging our codebases to streamline our work and allow for better integration.
In the space of a month, what started as a little hack to make my life a little bit easier has become a vibrant project with an engaged community of developers and users. The real key, I think, has been to make it as simple as possible to let users run the software, and to show I’m listening and responsive to feedback.
Our “Gender/Race/Class in engineering education” class has an “open topic” period that I’ve volunteered to help design… which means I’m going to Ask The Internet for help. (Hi!)
Based on our class discussion just now, we are interested in tackling this question: How do we interrupt the discourse that perpetuates inequity in engineering education? (Subquestions: who has access to this discourse as a listener? A speaker? What is that access based on — gender, race, class… age? geography? language? disability? intersections of any subset of that? What strategies do we have for doing this dialogue-interrupting work in professional and personal contexts?)
The course will be Monday, November 18, which is 2 weeks from now. We’re mostly PhD students in engineering education (technical backgrounds, social science research interests, lots of future engineering professors who care deeply about teaching). We have 3 hours in class, plus the ability to ask people to read a reasonable amount (<100 pages, English) before class. I’d love to hear thoughts, especially half-baked ones, on:
- “learning objective” suggestions — in other words, what do we want to learn during the course of the 3 hours? (Can be fact-based, skill-based, emotion-based, perspective-expanding-based…)
- “assessment” suggestions — given those learning objectives, how will we tell (at the end of the 3 hours) whether we’ve learned them, and how well? Does not need to be a test; could be questions for reflection on our own, etc.
- Reading suggestions — scholarly or not. (For instance, Alice Pawley has offered to let us read her CAREER proposal on feminist engineering — a short, highly competitive grant for junior scholars whose committee was probably not used to getting “feminist” proposals.)
- Activity suggestions — discussions, games to play, short bits of theatre to act out and/or improvise upon, provocative question prompts, etc…
Potential inspiration: our guiding question/framing about “interrupting discourse” came from a discussion on “how do we talk to people about this?” and an interest in intersectionality, especially with disability/access. I’m personally curious about the history of opening these dialogues in STEM: who (tenured? white? male? western?) started the conversations about women in physics, minority races in computing, wheelchair-accessible chemistry labs, etc — and when, and how, and what were the responses?
Comment away! I will post readings (or reading notes, if readings are not freely available), discussion questions/guidelines, and a story of what happened in the class once we run it — basically, whatever I can do to make the experience we’re creating here available and reusable by more people.
Bzr on MacOSX Mavericks
Using Bazaar on MacOSX is a cinch with Homebrew. All you need to do to get started is:
1. In Terminal paste the following:
ruby -e "$(curl -fsSL https://raw.github.com/mxcl/homebrew/go)"
2. Then install bzr from Homebrew: brew install bzr
3. Define where Python is using the following command: export PYTHONPATH=/usr/local/lib/python2.7/site-packages
Then you should be good to go!
I can’t believe that it has been months since I posted! I’ve been trying to get this post up for weeks now and I’m just going to put it out there and give updates when I can.
Some of the foss2serve group have been working to bring the MouseTrap project to be current with GNOME. MouseTrap uses a low-cost webcam to interpret a user’s head movement to control the cursor. The project has only had language updates since mid-2010 and needs to be updated to GNOME 3 and Python 3.
Several Western New England University, Drexel University, and Nassau Community College students and professors started working on the project at the end of 2012. This fall, six CS seniors in my CS 490 Software Engineering course are working towards getting the code current. If you want to join us, we’re using the gnome3-wip branch. We’d love company on our adventure into cursor movement via the webcam!
Our approach is to do some development in parallel with building documentation infrastructure for the project. We’ve got a set of requirements up and are working on design which we’ll post in a couple of weeks. We’ve also been fixing and filing bugs as we go.
Stoney Jackson and I are managing the project together and Stoney has been chronicling his learning. He has become a git expert and we’ve both learned loads about how to manage bugs and enhancements in Bugzilla and how git works. We’re also learning about how to better operate within the GNOME A11y community.
In related exciting news, the Software Engineering students from Western New England University went to the GNOME Summit in Montreal. They spent time hacking and learning. Many thanks to Joanie Diggs, Karen Sandler and Ryan Lortie. Joanie and Ryan helped hugely in pushing the MouseTrap code along, and Karen provided insight into licensing and open source culture. There appeared to be some time for fun as well!
CS 490 students enjoying the GNOME Summit!
Commit messages are an important part of how software is developed, debugged and maintained, and when done badly can become an unnecessary barrier to collaboration in open source projects.
Bad commit messages make it harder to figure out where problems have been introduced, especially for newcomers to a project.
The worst-case scenario for anyone trying to make sense of changes to a project is a commit message that offers basically no information for a major change affecting multiple locations in the code.
To get a good sense of how commit messages are useful, take a project and look at its history in the revision system. You’ll see something like this:
- Revision 1525597: Add ap_errorlog_provider to make ErrorLog logging modular. Move syslog support from core to new mod_syslog.
- Revision 1514267: tweak syntax strings for ServerTokens
- Revision 1514255: follow-up to r813376: finish reverting r808965 (ServerTokens set foo)
- Revision 1506474: server/core.c (set_document_root): Improve error message for inaccessible docroot.
- Revision 1498880: Replace pre_htaccess hook with more flexible open_htaccess hook
Or, if you’re unlucky, you might see something like this:
- Revision 1525597: fixed it
- Revision 1514267: more changes
- Revision 1514255: bug fixes
- Revision 1506474: more improvements
- Revision 1498880: lots of changes
If you now imagine you’re looking to find out where, say, the ServerTokens syntax changed, you can see the value of providing good commit messages.
So, how can you write better commit messages? Below are some top tips.
Keep messages brief and easy to scan
Commit messages should be brief and easy to scan. Often the reader of commit messages is viewing them in a log or revision history, so make sure the most important words and phrases stand out.
There is no hard rule about this. Some developers prefer an approach of having a very short one-line message but with optional subsequent paragraphs of context and description, whereas others prefer to only provide one line of any length, and link to detailed explanations elsewhere, such as in the issue tracker.
However, you should use your common sense as to how much information should be in the commit message. If you find you’re writing lots of explanatory text, maybe you need to put more comments in the code itself where the changes are made, or add more detail to an issue in the tracker.
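These structural conventions are easy to check mechanically. As a rough illustration – the thresholds and rules here are my own, not any project's standard – a pre-commit hook could lint messages like this:

```python
def check_commit_message(message):
    """Return a list of style warnings for a commit message.

    Checks the conventions discussed above: a non-empty, scannable
    subject line and a blank line separating subject from body.
    The 72-character limit is an illustrative choice.
    """
    lines = message.splitlines()
    if not lines or not lines[0].strip():
        return ["message is empty"]
    warnings = []
    if len(lines[0]) > 72:
        warnings.append("subject longer than 72 characters")
    if len(lines) > 1 and lines[1].strip():
        warnings.append("missing blank line between subject and body")
    return warnings
```

Note that a purely structural check would still pass a message like "fixed it" – tooling can enforce the form, but only the author can supply useful content.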
Make messages easier to find when searching
As well as scanning the revision history, developers also search logs using grep or similar tools, in which case it's important to use the best terms for discovery. For example, if you use component or module names, make sure you spell them correctly and use them consistently: if a component is called “DownloadManager”, don't use “Download Manager”.
Commit messages can also turn up in search results, either through project-specific searches or regular web search engines. So it's important to be clear and consistent in your use of language.
Provide sufficient context
While brevity is desirable, commit messages need sufficient context to be useful.
For a one-line fix, you can always view the diff to see what changed, but if a commit affects multiple files or multiple lines of code, it needs more explanation so that other developers and users can re-establish the context of the commit.
Peter Hutterer suggests a commit message needs to answer three questions:
- Why is it necessary? It may fix a bug, it may add a feature, it may improve performance, reliability, stability, or just be a change for the sake of correctness.
- How does it address the issue? For short obvious patches this part can be omitted, but it should be a high level description of what the approach was.
- What effects does the patch have? (In addition to the obvious ones, this may include benchmarks, side effects, etc.)
You don’t need to go into a lot of depth, but you need to capture enough of what is going on that someone reading the revision history can get a sense of what your commit did without having to look at all the diffs.
Added unicode support for imported files to prevent encoding errors in article.title
This doesn’t necessarily need to be in the message itself – for example, if there has been a discussion on the mailing list, or there is plenty of information in a related issue on the project tracker, then you can include a reference or link to this in the commit message.
Added unicode support for imported files to prevent encoding errors in article.title (see bug #1345)
Some issue trackers can also link commits to an issue automatically based on the commit message, in which case you need to make sure you’re using the correct format for it to pick this up.
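Tracker linking generally works by scanning the message for reference patterns. A minimal sketch of that extraction, matching the `#1345` style used in the examples above (real trackers each define their own formats):

```python
import re

# Matches references like "#1345", "bug #1345", or "fixes #42".
ISSUE_REF = re.compile(r"#(\d+)")

def extract_issue_refs(message):
    """Return the issue numbers referenced in a commit message."""
    return [int(n) for n in ISSUE_REF.findall(message)]

msg = "Added unicode support for imported files (see bug #1345)"
assert extract_issue_refs(msg) == [1345]
```

This is why the exact format matters: a message saying "bug 1345" with no `#` would silently fail to link under a pattern like this one.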
Provide credit and recognition where it is due
While you may be committing the changes, you may not in fact be the author – if you’re applying someone else’s changes, you need to acknowledge that fact and give the author recognition. Even if it’s not a complete submitted patch, but just a “if you change x to y that would fix the bug”, it’s worth putting in an acknowledgement.
Added unicode support for imported files to prevent encoding errors in article.title (see bug #1345). Thanks to Jane Doe for the patch
This has both a social function (placing credit where it is due) and also provides an audit trail.
(Some projects prefer a more formal “Submitted by: <username>” but I like to just say “thanks to <username>”.)
Troy Hunt provides another rule of thumb for commit messages: subsequent commit messages from the same author should never be identical.
This is partly because it makes it more difficult to distinguish changes in the version history, and partly because each change should, logically, be different to the last.
Try not to swear or insult anyone
Fixed stupid $$&!! mistake caused by £$%$%@ Steve
OK, it is difficult sometimes, but let's keep things professional. Save your venting for the IRC channel.
More seriously, commit messages form part of the overall tone of communications for a project; snarky, rude and unhelpful commit messages don’t put your community in a good light, particularly for newcomers.
Check the logs to see how you’re doing
Every now and again it's worth checking the log or revision history for your project and reviewing the last page or so of commit messages. Would someone relatively new to the project get a good idea of what was happening? Can you improve the usefulness of the messages that you and your community members are writing?
Follow project guidelines
Your project might have a preferred format for commit messages, so make sure you find out before making a commit.
For example, Moodle’s commit message guidelines call for a message subject line consisting of the issue number followed by component name, and the rest of the subject up to 72 characters.
I’d love to hear any more suggestions for better commit messages (or your worst examples of bad practice!)
For a random commit message, give WhatTheCommit a whirl
Photo credit: Wilson Afonso
I’ve just come back from a weekend in Liverpool helping organise and run OggCamp, the biggest Free Culture community event in the UK. While I can’t say for sure that it had the highest attendance of its 5-year run (numbers have been pretty much stable for the past 3 years), there are several ways in which I think it was the best.
OggCamp is an “unconference”, meaning most of the schedule is decided on the day, and the sessions are delivered by the attendees. We also have a scheduled track covering topics including:
OggCamp is a free event organised and run by volunteers, and as such the speakers on our scheduled track talk for free as well. Having speakers in our community willing to deliver professional-quality talks on such a range of topics is one of the things that made this event great.
That’s not to mention the unconference talks which included:
When you’ve organised a 2-day event with no schedule, there’s always the prospect of people turning up and no talks being offered, so it was fantastic to have such an amazing community who come every year and make the event what it is.
The final thing that made this OggCamp amazing was the support from our sponsors and community. This year, we introduced a pay-what-you-want system where, if people wanted, they could donate to the event when they signed up for their free ticket. This proved to be massively successful, and led to our community being our biggest single cash sponsor. On top of that we had a huge amount of support from companies such as Bytemark and Canonical, which meant that we could provide a really fun experience for attendees, including free play arcade machines, some excellent raffle prizes, and a posh venue for the evening social.
We’ve already had some excellent ideas from the community for next year. See you in 2014…
Over the Columbus Day weekend I attended the GNOME Montreal Summit with my Software Engineering class. The Summit was a hackfest where GNOME developers and contributors get together to get things done. We met many of them there, and many of the developers took time out of their schedules during the weekend to talk to us. They spent time showing us ways to improve our current development process, like using jhbuild. The director of GNOME also took time out of her schedule to talk to the class about licensing in GNOME, and why OSS is so great.
Besides just interacting with the GNOME community during the days at the conference, we got to interact with them at a couple social events. The community members were very welcoming to the class as newcomers, and included us in their activities. It was a great experience to get to collaborate with the community both in a development environment, as well as in a social setting.
In general, learning in an OSS community is very different from learning in a typical class. The knowledge I have acquired while working on the MouseTrap project and interacting with the HFOSS community has taken the theoretical material from class and applied it to a real situation. I have learned how to effectively search for answers on my own, without having the answers told to me in a lecture. I have spent a lot of time searching wikis and online documents, and learning how to extrapolate answers to my problems from the solutions I found online. This skill will help me in my future professional life, as I will not have the answers to my problems lectured to me. Being able to learn on my own, and solve problems based on existing solutions, will be critical to my success in the future.
My Software Engineering class has introduced me to the world of Open Source Software (OSS) and all of the benefits it has to offer. As a newcomer to Open Source software, it was very intimidating to even think about contributing to a project significantly larger than anything I had ever seen before, in a language in which I did not have much experience at all. Overall, the community has been amazing and shown me what Open Source Software is really all about. My entire perspective on why Open Source software even exists in the first place has changed dramatically. In my eyes Open Source has gone from more of an idea or concept to a matter of principle: users should be free to exercise the following four essential freedoms:
- The freedom to run the program, for any purpose
- The freedom to study how the program works, and change it so it does your computing as you wish.
- The freedom to redistribute copies so you can help your neighbor
- The freedom to distribute copies of your modified versions to others.
I have learned that in order to create better software, as a society, it must be Open Source. This means that we do not limit ourselves to a company, or even a small group of collaborators within a company as in most cases, but instead leave it open to the entire world. This allows for software that exhibits more stability, reliability, security, usability, diversity, testability, flexibility, and most importantly modifiability. Not only does Open Source Software come with all of these benefits, but it delivers them with faster development time and at a lower cost in most cases. It wasn’t until the vast benefits of Open Source Software were laid out in front of me that I realized how essential a concept it is in today’s world. It allows great and passionate minds who may otherwise never have been introduced to Software Engineering to thrive, and in doing so benefit the entire community.
My experience at the Montreal Summit really drove home the concept of Open Source Software and its benefits. I hadn’t had any sort of experience outside of a “regular” class with regard to Software Engineering / programming up until this event. It was amazing to see so many brilliant, passionate individuals all in one place, contributing in many cases just for the love of making the product better. Many contribute for their job, for fun, to help others, to learn, or simply because they want the software to work ‘their’ way. I don’t imagine I will find such a diverse group of individuals in any office I work at in the future, and I would be quite surprised to find coworkers as passionate about their work in a typical office setting. Graduating soon, I find myself constantly trying to prove myself to a potential employer just to be considered worthy of contributing to their code base, which will arguably not be as large or broad as a well-established Open Source repository (like GNOME). I found the opposite to be true at the GNOME Summit: everybody was more than willing to lend a hand, and in many cases particular individuals spent large amounts of their time assisting us in any way they could. I could not have imagined a warmer welcome to the Open Source community. There were many people who were eager to do everything from explaining the broad benefits of Open Source (both legal and practical), to helping with terminal commands, down to suggesting better tools like jhbuild and showing how to configure and use them to make us better, more efficient contributors to the project. My learning experience in Open Source thus far has been invaluable, and will no doubt help me in my professional life as I enter the working world as a more informed individual. I fully intend to continue contributing to Open Source Software to the best of my ability as I learn and develop as a Software Engineer; I see it as the golden ticket to staying current and learning from the best in the business.
It’s been about two months since I was first introduced to the open source project MouseTrap. In that time, I’ve been thrown into the world of Free Open Source Software (FOSS), and I’ve been given a little insight into just how it works. In our class, we’ve had traditional lectures, but our learning goes beyond that. We’ve not only communicated with big contributors in the FOSS community, but we’ve also had the opportunity to work alongside them, even if for a brief time.
In our class, we’ve mainly focused on the software engineering side of the coin, which entails the documentation that kick-starts a project. This part has felt exactly like a “regular class”, where we go over different documentation techniques and then apply what was learned to the project at hand; in our case, MouseTrap.
What is different, though, is that we are constantly connected with the community. We’re almost always connected to the community via IRC, at least in class. We also had the opportunity to go to the GNOME developer conference and not only listen in on what exactly these developers were working on, but also ask them questions related to our project; some even took time out to sit down with us and help. For these reasons, learning in a FOSS project feels a lot more hands-on than classes we’ve taken before, which relied more on theory. I’m soon going to be jumping into the professional world, and what working with FOSS has taught me is that communication is vital to any project. Most of these people have never met in real life or don’t even live in the same country, but they’ve been brought together by the same project with the goal of providing a complete product for anyone to use, and I think that’s just fantastic.
In short, I’ve enjoyed this style of learning. It not only allows us to stick to the professional side, with the documentation, but it also allows us to delve into the meat and potatoes of it all which is the code. The members of FOSS that I’ve met on my journey thus far have been incredibly helpful and patient as I and the rest of the class settle into our new environment. Their drive makes me really want to stick with FOSS after this class just so that I can one day emulate their work and hopefully spread FOSS further.
Since I decided to cancel my Ubuntu Membership today my e-mail firstname.lastname@example.org will cease to function shortly. I can however be reached via my contact page and will be updating my bug accounts upstream to reflect this change.
Visualizing Data, Rearchitecting Stacks, and Translating Music at HackUpstate
HackUpstate is a regional hackathon with the mission
"to unite and facilitate collaboration among greater Upstate New York hacker communities. In doing so, we aim to contribute to the growth of Upstate NY Tech sector, and create a robust network of technologists and tech companies."
Searching for Code with Smart Scopes on Ubuntu 13.10
I’m really pleased with how the dash is becoming a place for more productivity in 13.10 with the latest polish of Smart Scopes. Searching for code on GitHub is made easy, and finding that new remix by Wild Boyz on SoundCloud is a cinch.
The Ubuntu Kernel Team dropped the latest and greatest kernel from upstream into Ubuntu, which for me has brought some small improvements in performance and a slight decrease in power consumption. Additionally, wireless seems to be a bit more stable with this kernel.
The latest version of Firefox shipping in Ubuntu is 24.0, which has a lot of great fixes and now beats Chrome/Chromium in recent benchmarks. Although I use the Firefox Nightly version, it’s good to see that Mozilla Firefox is still shipping as the default browser in Ubuntu, since there was some debate about changing to Chromium.
All in all, a small release for the desktop with minimal changes, but the kernel and Smart Scopes changes alone were good. Looking forward to 14.04 LTS, which should be a very crisp release with an even newer kernel and some nice fresh packages from upstream for all the default applications in Ubuntu.
Today, after a few days of work, I finally got
RetroPie up and running at an
acceptable speed. It took a little longer and was a little more difficult than
anticipated, so it makes sense to blog about it.
SD Card Selection
Surprisingly, your SD Card makes a significant difference in the time it takes
to load games and write out game state. The other, non-obvious, impact of the SD
card is contention for IO when other tasks on your pi are reading/writing
from/to the SD card.
I had initially chosen a slow, older class 4 card. Whenever some other process
wrote out to disk, my games would noticeably lag a bit. I haven't had that
problem since switching to a Class 10 though.
Don't just rely on the class number to give you information about the
performance characteristics of the SD card. Check out this handy
list before buying.
Go with vanilla Raspbian.
I've played with Darkbasic's
PiBang, and vanilla Raspbian for the host image.
Raspbian was the fastest by far (after you disable all of the crap, of
course). It's also convenient to have raspi-config and rpi-update installed
from the getgo. RetroPie expects that you're running Raspbian, and you can hit
some interesting behaviour if you aren't.
While I initially liked PiBang, systemd is a poor choice for the pi if you want
to emulate. I noticed
as I was playing games, systemd was eating a large portion of my CPU, and would
occasionally cause lag when there was a sudden burst of activity. Moving to a
non-systemd distro (Darkbasic's Raspbian) helped significantly.
Darkbasic's Raspbian was slow enough by itself. I never worked out the reason,
since it uses the same repos as vanilla Raspbian, with all of the same
optimizations; even plain terminal performance was noticeably slower.
I'll post benchmarks later (possibly never).
Darkbasic's Raspbian image is also very minimal, and not configured the same way
as vanilla Raspbian. This can lead to some interesting issues when trying to
build and run RetroArch (more later).
If you choose not to install Raspbian, and use some minimal RPi distro, I'll
include some helpful hints throughout (I set this up several times on the above
distros).
Let's get started
Install your favorite image normally. Don't stop any services or do any
configuring just yet. Stick to a vanilla install, since that's what RetroPie
expects (and it doesn't like any deviation).
Add an input group.
addgroup --system input
Make sure that
/opt/vc/lib is early in your linker path (check the files under
/etc/ld.so.conf.d). Some distros will install versions of
libEGL and related graphics libraries earlier in your library path. These
generally do not work and will throw weird, un-Googleable errors.
If you see
libEGL warning: DRI2: xcb_connect failed
libEGL warning: GLX: XOpenDisplay failed
when starting emulationstation, that's symptomatic of this problem. The libraries in
/opt/vc/lib are built to use the Raspberry Pi's VideoCore chip.
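As a diagnostic sketch, you can ask the dynamic linker which libEGL it will actually pick up (the full /sbin path is used since it may not be on a normal user's PATH):

```shell
# Print every libEGL registered in the linker cache, in search order;
# on a correctly set-up box the /opt/vc/lib copy should come first.
/sbin/ldconfig -p | grep libEGL || echo "no libEGL registered"
```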
Pull down the source
After you've logged in and gone through the initial setup for your pi, pull down
the RetroPie-Setup source.
git clone git://github.com/petrockblog/RetroPie-Setup.git
Before we compile, I'd recommend setting your CFLAGS. Thanks to a patch I
submitted, RetroPie-Setup now has some sane defaults (
-O2 -pipe -mfpu=vfp -march=armv6j -mfloat-abi=hard), but those aren't
fast. I'd recommend
-Ofast -fno-fast-math -pipe -mfpu=vfp -march=armv6j -mfloat-abi=hard.
-Ofast turns on the fastest optimization level, while potentially building
unsafe code. It works quite well for me, but YMMV.
-fno-fast-math turns off a GCC optimization that -Ofast enables.
fast-math uses some "faster" math that doesn't give you the guarantees of some
of the standard math functions. In some cases, it's also less accurate.
-mfpu=vfp and -march=armv6j together tell
gcc to generate code
specific to the raspi architecture, as well as what kind of FPU-specific machine
code to generate. These two flags are both used by the Raspbian project to build
its packages.
-mfloat-abi=hard tells the compiler to generate fpu-specific
instructions. Without this flag,
gcc will generate floating point code
that will not run on the FPU, but will link to library functions that emulate
floating-point operations. This is slow. All Raspbian packages are built with this
flag (that's what puts the 'hf' in 'armhf').
If you're worried about the problems the new flags could introduce, then stick
to the defaults. If you're just reading this and you already have RetroPie
installed, you should probably rebuild to get the new compiler options ;).
Set the CFLAGS with
export CFLAGS="-Ofast -fno-fast-math -pipe -mfpu=vfp -march=armv6j -mfloat-abi=hard"
(note the quotes; without them the shell treats each flag as a separate name to export and the command fails)
After all of this, it's just a quick
sudo ./retropie_setup.sh. Pick and
choose what you want; it's really none of my business, and it's quite easy.
Make sure you pick the source-based builds (I guarantee that the binary ones
weren't built with any good options) and that you tell the installer to start
emulationstation on boot.
update-rc.d avahi-daemon remove
update-rc.d bluetooth remove
update-rc.d cron remove
update-rc.d cups remove
update-rc.d dbus remove
update-rc.d ifplugd remove
update-rc.d lightdm remove
update-rc.d nfs-common remove
update-rc.d rpcbind remove
update-rc.d rsync remove
update-rc.d rsyslog remove
update-rc.d saned remove
update-rc.d triggerhappy remove
update-rc.d ntp remove
K06rpcbind README S02dphys-swapfile S02ssh S05plymouth S05rmnologin
I use ssh to manage the system, so I leave that. Everything else must go.
Don't get me wrong: usually I like logging and
cron, but they're just not
necessary on a pi that's dedicated to playing games.
PiBang uses this half-assed systemd with sysvinit compatibility. It's a pain in
the ass. Almost all of the packages that need to start at boot that you get on
PiBang need to be disabled with
update-rc.d. Some others, though, need to be disabled with
systemctl disable <service>. I'll leave it as an exercise to the
reader to figure out how to tell.
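One way to tell (a sketch; `cron` is just an example service name): sysvinit-managed services have a script under /etc/init.d, while systemd-managed ones ship a unit file:

```shell
svc=cron   # example; substitute the service you want to disable
if [ -e "/etc/init.d/$svc" ]; then
  msg="$svc: disable with update-rc.d $svc remove"
elif [ -e "/lib/systemd/system/$svc.service" ]; then
  msg="$svc: disable with systemctl disable $svc"
else
  msg="$svc: no init script or unit file found"
fi
echo "$msg"
```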
Minimal image notes
You're probably sitting there laughing at how much bloat Raspbian comes with.
While that's fair, keep in mind that the RetroPie script installs a ton of shit,
cups and sane included. Make sure you audit your startup as well.
Configuring a joystick
Your joystick needs to be configured before you can start playing games.
Emulation Station will configure it for you when you first start it up, however
that configuration only applies to Emulation Station. You need to configure your
emulators on their own.
./retroarch-joyconfig -o p<playernumber>.cfg -p <playernumber> -j <joysticknumber>
# joysticks are 0-indexed (they map to /dev/input/jsN)
# players are 1 indexed
cat p*.cfg >> ~/RetroPie/configs/all/retroarch.cfg
You'll need a
udev rule to automatically give the
input group permissions to
access the joystick devices. Drop the following into
SUBSYSTEM=="input", GROUP="input", MODE="0660"
This ensures that the
input group owns any new input devices, and that they are readable and
writable by members of the group.
But moooom, I want to use an XBox controller
No. You don't.
You can use
xboxdrv to hook up your XBox controller. It's a userspace driver
that talks to the raw joystick device and interprets the XBox controllers's
raw output into something meaningful.
Since the driver lives in userspace, anytime you twiddle a joystick or press a
button, you're hopping like mad between userspace and kernelspace. This is slow.
I could actually emulate the slow-mo feature provided by RetroArch just by
twiddling both joysticks.
No. I'm a special snowflake and I demand to know how to use an XBox controller
apt-get install xboxdrv
Then add xboxdrv --silent & to /etc/rc.local, just before the final exit 0 (don't redirect with > — that would clobber the file).
If you don't want to reboot, just execute
xboxdrv --silent &
You've been warned.
Starting emulationstation on boot
Luckily, RetroPie did half of the work for you! If you selected the option
(either in setup or install) to start
emulationstation on boot, it'll drop a
handy snippet into /etc/profile which will do that on login.
However, you still need to login.
To alleviate that, edit your inittab.
1:2345:respawn:/sbin/getty --autologin pi --noclear 38400 tty1
#1:2345:respawn:/sbin/getty --noclear 38400 tty1
#2:23:respawn:/sbin/getty 38400 tty2
#3:23:respawn:/sbin/getty 38400 tty3
#4:23:respawn:/sbin/getty 38400 tty4
#5:23:respawn:/sbin/getty 38400 tty5
#6:23:respawn:/sbin/getty 38400 tty6
Go to line 45 (on raspbian, or where the gettys are started), and enable
autologin for whatever user is going to run RetroArch.
In the spirit of disabling everything, I also stopped all of the other gettys.
Should I overclock?
Well, it depends. Want to run SNES games that have Mode-7 graphics? Or any of
the SNES games that packaged an extra processor on the cartridge? Yes.
In most other situations, you're fine if you follow this guide. The one thing I
would recommend doing, however, is changing the memory/gpu mem split. I gave my
GPU 384MB of memory. I'm barely breaking 20MB of memory usage running my pi, so
I don't need very much system memory.
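On Raspbian the split lives in /boot/config.txt (raspi-config edits the same setting); the value above would appear there as a single line:

```
gpu_mem=384
```

Reboot after changing it for the new split to take effect.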
Your game will start slowing down as the proc
overheats, so make sure you have a heatsink and a case with some kind of airflow
if you plan on overclocking.
Notes for non-raspbian users
Yeah, you probably do want to overclock. If your distro is compiled with dumb
options, it'll help a bit.
On both Darkbasic's Raspbian and PiBang, I had to overclock to get decent performance.
That's about as perfect a setup as you can get for RetroPie. If you have any
other suggestions, leave them in the comments. I didn't configure RetroArch
any more than shown above because, well, I didn't need to. For all of my ROMs
built by the homebrew community that are totally free and legal to play, I ran
into no slowdown with this setup on Raspbian.
However, YMMV, as always.
Writer: Inès Joëlle NIRAGIRA
ICTDEVers is a group of local ICT for DEVelopment practitioners and academics (students and their supervisors). They meet monthly to link up ICT4D projects and the people behind them. Through these get-togethers members get the opportunity to showcase what they are doing at their institutions and receive/offer help. Most attendees are currently postgraduate students from the University of Cape Town, University of Western Cape and Cape Peninsula University of Technology whose research area and interest are in ICT4D. There was also an academic from the University of KwaZulu-Natal, who flew to Cape Town to attend the event.
ICTDEVers get-togethers take place once a month and meeting venues rotate amongst the three institutions (UCT, UWC and CPUT).
The October Get-together
The October get-together took place at Cape Peninsula University of Technology and was chaired by Dr. Michael Adeyeye. A crowd of over forty enthusiastic people gathered in the CENCRA room, as at the previous ICTDEVers meeting at CPUT.
Attendees were given a chance to introduce themselves and briefly talk about their research projects and research interests.
On the Agenda
Dr Michael briefed everyone about an event that had recently taken place at CPUT: the Free Open Source and Hardware Symposium (FOSHS’13). Topics from the FOSHS’13 were to be discussed during the ICTDEVers meeting and members were invited to engage in discussions about the various topics namely:
- Mozilla: The Firefox OS Tools and Apps
- ZeroReserve: The Bitcoin Reserve Bank
Mozilla: The Firefox OS Tools and Apps
Brand new features recently implemented in Mozilla Firefox were presented, including an overview of the Firefox web developer tools, overview of the Firefox OS concept and design, the Firefox OS phone simulator (currently an add-on to the browser) and a physical demonstration of the phone. Discussion on how to deploy apps to the Firefox phone and the features included in it closed the presentation sessions.
Attendees got a chance to ask various questions about the gadget’s features and make comments on how well they think the phone fits into the current tight smartphone competition.
ZeroReserve: The Bitcoin Reserve Bank
“Very first time I hear of the word ‘Bitcoin’, am I the only one here? I wonder” exclaimed one of the supervisors as soon as the topic slide was displayed on the screen.
“I think I’ve heard of ‘Bitcoin’ before, but I’m just wondering how it is related to the reason why we are here tonight,” said another attendee.
“A completely new framework for understanding money, trade and financial management is embodied within the ZeroReserve proposal,” Dr M Adeyeye presented. Whether Bitcoins are “open source money”, “soft money” or a virtual monetary representation of trade remains an open issue. Emphasis was placed on the possibility of using Bitcoins and setting up an organized ZeroReserve so as to gain financial independence for African nations, especially from the Western countries controlling the main monetary organizations in the world, and from excessive dependence upon the strongest currencies in the world, such as the United States dollar and the European euro. “Currently, ZeroReserve is a prototype Linux application which does not require a banking gateway and is implemented as a plug-in for RetroShare (an open source, cross-platform, peer-to-peer, secure and decentralised communication platform),” he added.
Attendees were pretty intrigued by the topic and spent a good amount of time discussing it and asking how well known the virtual currency is on the African continent.
Closing and Socialising
Various matters, including the venue for the following ICTDEVers meeting and the upcoming International Conference on ICT and Development to be held at UCT in December, were discussed. Attendees were offered a finger supper and drinks.
The Thunderbird Team is interested in knowing what your favorite add-ons for Thunderbird are, and we will be restarting the “Thunderbird Add-on for the week” across all of our social media channels. If you know of an add-on that you think deserves highlighting, please leave a comment on my blog or ping me on IRC (bkerensa in #Thunderbird on Mozilla IRC) and I will be sure to check it out.
You can also check out all the newest Thunderbird Add-ons right here and be sure to follow us on Facebook, Twitter and Google+!
Two Sundays back was the last day of Mozilla Summit 2013, where thousands of Mozillians came together in three cities to discuss the future of the open web. Summit was everything I had hoped for and much more. There were moments as a site host lead that were challenging, but as an attendee there were moments that were exciting, fun and powerful.
Me and Mozillians from India
I have to say that the closing reception speech by Mitchell Baker was very powerful, because it really made clear how important a role Mozillians play in building the internet the world needs. For me, summit was a great community experience: it really showed how big the Mozilla Community is, and how we all have that one common mission regardless of where we are from and what language we speak.
Mozilla Pakistan Booth
I was very impressed with the World and Innovation Fairs because each event offered a line-up of really interesting booths. My favorite booths at the World Fair were the Pakistan, India, Romania and WoMoz Community booths. At the Innovation Fair, I have to be honest, I spent a lot of time talking with Asa Dotzler about Firefox Metro, and after that I spent time checking out the UX table, which showed me some previews of potential new layouts for Mozilla.org.
It was also really cool to see some automated battery-life testing for Firefox OS being done on Xubuntu, the XFCE community-driven flavor of Ubuntu.
I think this summit did one thing for a lot of us, and that was to build up energy and momentum that we can take back to the various projects we’re working on, and that will help fuel us until the next time we all come together again.
Wa-pa-pa-pa-pa-pa-pow! and check out my photos from Summit!
Free Open Source and Hardware Symposium (FOSHS ’13)
Bringing together industry, developers, educators, the community and any other interested parties to discuss open source, open hardware, open web and academic/industry partnerships.
Department of Information Technology, Cape Peninsula University of Technology
October 10-11, 2013
Ernesto Gomez Tagle G (Violetta Platar)
Abdel Wahid Sabre Ousman (Ben Sabre Fils)
I. Introduction and Expectations
Although it may indeed be too early to declare that open source software and hardware constitute a trend in IT business and practice, it is clear that their relevance has been increasing in recent times, with a large number of applications and solutions adopting these standards. By “open source” it should be understood that an application’s or device’s source code or design can be freely replicated and modified by others, provided that credit is given to the original designer and that the derivative work is open to modification as well. It does not necessarily mean that it is free. Among others, some interesting issues regarding open source software and hardware that can be subject to discussion during the workshop are:
- Open source business models. How organizations may create or adapt their business models in order to produce revenue and therefore value for stakeholders.
- Open APIs for open source cloud applications. With cloud computing being a trend in modern IT, having captured the attention of researchers and practitioners alike, the impact of open source in cloud computing is relevant, especially through the use of open APIs.
- Ways to connect developers in an open source development environment. From the definition of an “open source” technology, it follows that collaboration between developers or teams of developers is the main force nurturing innovation within the open source IT field. How this collaboration actually takes place, and how improved communication between developers enhances open source project development, remains an open topic.
- The methodological aspects of open source development practice.
- Legal framework and regulations regarding the open source experience.
Among the many topics of interest regarding open source IT, the expectation for this symposium was to discuss them with colleagues and speakers, making it possible to learn about and contribute useful information on recent developments in this active field.
II. Featured Presentations
Translation Sprint on the Mozilla Projects (Arky)
The development platform and development tools for Mozilla Firefox were presented, including Gecko, Firefox’s web rendering engine. Firefox Nightly was introduced, and installing this distribution alongside production Firefox via profile management was suggested. The presentations aimed to promote the creation of a Mozilla developer community in Cape Town. The possibility of language customization of Firefox applications was also discussed.
ZeroReserve: The Bitcoin Reserve Bank (Koch, Rudiger)
A completely new framework for understanding money, trade and financial management, embodied within the ZeroReserve proposal, was presented. Whether Bitcoins are “open source money”, “soft money” or a virtual monetary representation of trade remains an open issue. Emphasis was placed on the possibility of using Bitcoins and setting up an organized ZeroReserve so as to gain financial independence for African nations, especially from the Western countries controlling the main monetary organizations in the world, and from excessive dependence upon the strongest currencies in the world, such as the United States dollar and the European euro. Currently, ZeroReserve is a prototype Linux application which does not require a banking gateway and is implemented as a plug-in for RetroShare (an open source, cross-platform, peer-to-peer, secure and decentralised communication platform).
Mozilla: Firefox OS Tools and Apps (Arky)
Brand new features recently implemented in Mozilla Firefox were presented, including an overview of the Firefox web developer tools, overview of the Firefox OS concept and design, the Firefox OS phone simulator (currently an add-on to the browser) and a physical demonstration of the phone. Discussion on how to deploy apps to the Firefox phone and the features included in it closed the presentation sessions.
III. Learning Experience
We also found useful the comparison between the web development tools provided by Google Chrome and Mozilla Firefox; identifying the advantages and disadvantages of each improves our knowledge for future development tasks.
In the case of the Bitcoin ZeroReserve, learning about this new monetary paradigm was indeed useful, and we noted a possible software development opportunity: using the virtual currency to support a pricing methodology for the informational content of financial products and services. Currently in the field of Computational Finance, increasing attention is being paid to the information flows within the system as a way of explaining the behaviour of financial variables, departing from traditional computational finance applications that rely on artificial intelligence, game-theoretical algorithms, or agent-based simulation testbeds.
It is known that the Achilles’ heel of Bitcoin is the reliability of exchanges performed using the soft money and the weaknesses of the associated price-discovery mechanism. One issue not noted in the presentation, in our opinion, is that excessive emphasis is placed on banks as the theoretical model that ZeroReserve opposes. This might be misleading, as banks make up just a fraction of the financial institutions present in the system. In any case, Bitcoin will most likely not be complete and functional until a proper billing engine is defined and implemented; and the usability tests required for that implementation will likely need a complete project specification of their own, because they will be at the core of the project’s success.
IV. Relation to Project
Our project relates to the implementation of a simple communication application which, based on an Asterisk script, enables users to make calls from mesh potatoes to PSTN or GSM peers using a VoIP-based architecture. This means that almost the entire project is based on open source software and hardware, so the topics reviewed during the symposium relate closely to our project objectives. This is especially true in the case of the Firefox OS phone, which may be analysed as an alternative for hosting a client receiving calls from the mesh potato users. This would eliminate the excessive dependence on smartphones for actual application deployment.
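As a rough illustration of the kind of routing involved (a minimal sketch, not the project’s actual Asterisk script), a dialplan bridging mesh extensions out to a PSTN/GSM gateway might look like this; the context name, the `gateway-trunk` peer and the dialling patterns are all assumptions:

```
; extensions.conf sketch -- context, peer names and number patterns
; here are hypothetical, not the project's actual dialplan
[mesh-users]
; 4-digit extensions ring other mesh potato users directly over SIP
exten => _1XXX,1,Dial(SIP/${EXTEN},30)
exten => _1XXX,2,Hangup()

; numbers dialled with a leading 0 are sent out via a PSTN/GSM gateway trunk
exten => _0.,1,Dial(SIP/gateway-trunk/${EXTEN:1},60)
exten => _0.,2,Hangup()
```

The split into local and outbound patterns is the usual way an Asterisk dialplan separates on-network calls from calls that must traverse a gateway.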
V. Future Activities
Our intention is to continue working on our project at least for the rest of 2013, with further tasks possibly relating to planning a fully meshed communication network based on extensive use of Session Initiation Protocol (SIP) platforms (e.g. Elastix or Trixbox), and closely following any new developments of the Firefox OS / Firefox phone, as we presume these will be dominant technologies in the future.
CPUT FOSHS ’13 [1, 2] has come and gone. Many thanks to our presenters, sponsors and colleagues who helped organize it. On day 1, Sydwell Williams gave an interesting talk on a Bitcoin application for CPUT and did some demonstrations. I must say it was another successful event. We had international speakers this year, and a number of colleagues and students came to the event. We also had the head of the institutional repository at CPUT and a number of attendees from outside CPUT. Here are some tweets on it.
Mozilla: Firefox OS Tools and Apps by Arky
Zero Reserve by Rüdiger Koch
Other Video Presentations:
Firefox OS Tools/Translation Sprint by Arky (1)
Firefox OS Tools/Translation Sprint by Arky (2)
Firefox OS Tools/Translation Sprint by Arky (3)
Some Pictures from the Event
Commuting is important due to its ubiquity and consumption of time, money and energy.
OSS Watch’s legal officer Rowan Wilson was fortunate enough to see Joel Spolsky of StackExchange speak at Open World Forum about the Cultural Anthropology of StackOverflow. I wasn’t able to attend, but there’s a longer version of the talk available on YouTube.
Joel presents some interesting points about how the design of a piece of software affects the way its users behave – this is crucial in this context as the software we’re talking about is a communication tool, so its design affects how a community communicates.
He describes the importance of first impressions:
The first impression on StackOverflow is that, if you’re a programmer, you get that these are all programmer questions… If you’re not a programmer you don’t understand a single thing and you leave.
This seems like a hostile and exclusive approach to community management – usually when we talk about building open development communities we talk about being welcoming to ensure we’re not putting off potentially valuable contributions. However, the goal of StackOverflow and similar StackExchange websites is to get expert answers to difficult questions – people who don’t understand the subject will only create noise, so putting them off from engaging early increases the site’s usefulness.
The talk is an hour long so I’ll leave you to watch the whole thing rather than picking it apart here, but it’s a really good overview of a very successful online support community, and discusses some ideas which might go against the conventional wisdom of community management.
There was an article written by Pearl Lee in the Saturday edition of The Straits Times. The title is “Packaging key to effective e-learning”.
You can get the online version from this link:
Briefly, the three main challenges universities face today are:
1. Increased Competition
The steady rise in research co-publications across countries has intensified the competition that academics around the world face.
2. New Ways To Teach
Teaching styles have to be adapted to suit the profiles of today’s consumers. And that means more online learning, for one.
3. Decreased Public Funding
In some countries, such as Australia, the government is reducing financial support, and this may affect the quality of research at universities.
IQ test does not tell everything about a child
This is the title of a short article written by Dudley Au in Today newspaper. You can find the full article at this link:
The important points I gathered from this article are:
1. We should not rely too much on IQ tests.
2. When we label children, they may become what we have labelled them.
WeBWorK::Rochester::2013, held at the University of Rochester from Friday October 4 through Monday October 7, was dedicated to
- putting final touches on release/2.8 of WeBWorK
- integrating the new user interface created by Peter Staab into the development branch of WeBWorK
A small group of experienced WeBWorK developers participated in this code camp: Davide Cervone, David Gage, Mike Gage, Geoff Goehle, John Jones and Peter Staab. My thanks to all of them for their contributions to a very successful weekend. My thanks also to Louise Wingrove for organizing the lodging and meals for participants.
This camp and previous code camps are supported by the NSF through a national dissemination grant to the MAA. ( link to www.nsf.gov once it is back up and running again. :-) )
The first outcome of the camp is an updated release/2.8 which we plan to merge with the master branch on December 1, 2013. We combined the original release/2.8 with most of the fixes and small features which have been submitted to the develop branch over the last three months. Both release/2.8 and the develop branch have been running smoothly under moderate course loads on the MAA testcourse site and on the hosted2 site at the University of Rochester. The activity devoted to release/2.8 over the next few months will be responding to bug-fix requests, minor adjustments of features and general polishing of the instructor experience. Very little has changed in the student interface, and there have been very few requests for changes in this aspect of WeBWorK. While not specifically adapted to mobile devices, the student view of WeBWorK works acceptably well on iPhones, iPads and Android mobile devices.
Features of release/2.8 are listed on the wiki at: http://webwork.maa.org/wiki/Release_notes_for_WeBWorK_2.8
(You can type release/2.8 into the search box of the wiki to find it.)
You can also view all of the work involved in creating release/2.8, step by step, by viewing the commits page on GitHub: https://github.com/openwebwork/webwork2/pull/182/commi
The most recent commits are at the bottom.
There will be more exposition about new features in release/2.8 (and some under-advertised features of release/2.7) in subsequent blog posts.
It should be noted that LibraryBrowser1, although its name has not changed, has received substantial improvements in release/2.8 from the work of John Jones. In general it should be much faster, because some of the ajax calls used in LibraryBrowser2 and LibraryBrowser3 have been used to speed up the rendering of individual problems on a library page. When enabled, the library page also allows for the easy tagging of library problems. (See http://webwork-jj.blogspot.com/2013/07/webwork-opl-workshop-charlottesville-va.html for more details.)
The second outcome of the camp is the new instructor interface, largely created by Peter Staab at Fitchburg State University, which has been merged into the develop branch of WeBWorK. This interface provides instructors with behavior that feels more like a "google app" instead of the form-based interface that we have been used to since the mid-2000s. Peter began work on this project during WeBWorK::Rochester::2012, held a year ago in June.
One of the early outcomes was "ClasslistEditor3", which has been available as an option in both release/2.7 and release/2.8. The current version includes the ClasslistManager (renamed and improved from ClasslistEditor3) and the HomeworkManager, which combines the duties of the Library Browsers, the HomeworkSetsEditors (1 & 2) and the Instructor tools page. The HomeworkManager's library browsing functions are built on the experience gained from the prototypes LibraryBrowser2 and LibraryBrowser3, which were written by David Gage. All of these tools have been available for testing in embryonic form on previous releases, but they have now progressed to the point where they can usefully speed up many standard instructor tasks.
WeBWorK::Rochester::2013 allowed Peter to explain in person his work and his vision to several of the other core WeBWorK developers. (Peter has not been able to attend any of the code camps since last June.) We now have a clearer idea of what has to be done to finish the transition. We were able to make significant strides in improving reliability during the code camp itself but much more remains to be done.
The net effect of using the ClasslistManager and HomeworkManager is that instructors can manipulate classlists (add students, change passwords, etc.) and homework assignments (create, assign, etc.) immediately. The updates of these changes to the back-end server are done asynchronously and are invisible to the user.
At the moment the develop branch is fairly wild. Some actions don't behave as you expect or as they should; there are many features of the older editors and browsers that have not yet been implemented in the new interface. In some cases things that work fine on small sets or classes slow down drastically when the scale is increased. We expect that it will take many months before this develop branch is ready for use on a regular basis.
On the upside -- the student interface is not affected, and so far at least there is no effect on stored data. Since the old editors are still available, one can simply switch to them for features that are not yet implemented, and then switch back to the new "managers" for their added convenience on tasks where they work well.
For those helping with development:
- submit bug fixes and small feature tweaks to the release/2.8 branch
- submit new features to the develop branch
In all cases make sure that you are in sync with the branch you will submit to before you send a pull request. If the commit does not merge cleanly it will be returned for more work before it is even reviewed.
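The sync requirement above can be sketched in a throwaway repository; the branch names used here (`target` standing in for release/2.8 or develop, and the topic branch `my-bugfix`) are hypothetical:

```shell
#!/bin/sh
# Sketch of keeping a topic branch in sync before sending a pull request.
# Branch names are hypothetical; "target" stands in for release/2.8 or develop.
set -e
work=$(mktemp -d) && cd "$work"
git init -q repo && cd repo
git config user.email dev@example.org && git config user.name Dev
git checkout -qb target
echo base > file.txt && git add file.txt && git commit -qm "base"

git checkout -qb my-bugfix                 # start the topic branch
echo fix >> file.txt && git commit -qam "bug fix"

git checkout -q target                     # meanwhile, the target branch moves on
echo other > other.txt && git add other.txt && git commit -qm "upstream work"

git checkout -q my-bugfix
git rebase -q target                       # re-sync: replay the fix on top of target
git rev-list --count HEAD                  # prints 3: base, upstream work, bug fix
```

In the real workflow the same idea applies against the openwebwork remote: fetch, rebase your topic branch onto release/2.8 or develop, resolve any conflicts locally, and only then send the pull request.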
You need sound coding skills to create good software, but the success of an open source project can also depend on something much less glamorous: your choice of software licence.
Last week I spoke to Paul Rubens of CIO.com about the issues that need to be considered when deciding which licence to use when releasing your code, including why a licence is necessary, the varieties of Free and Open Source Software licences, and how you provide licences for the non-software parts of your project.
You can read the full article at CIO.com.
Once again I have the (incredible) opportunity to be at Hacker School playing around with my “edupsych for hackers” material… I’ve never revised and re-delivered a talk so often, and it’s good to be forced to see how this material improves with age and experience.
Differences between this and the PyCon Toronto version include the cutting-out of Bloom’s Taxonomy (it’s cool, just not high-priority), the separation of nearly all the Felder-Silverman Engineering Learning Styles material to a separate workshop for tomorrow, and dropping the emphasis on (making fun of) academia’s complicated verbiage, because… that’s not the point.
The slidedeck is at http://bit.ly/hackerschool-f2013 and embedded below. Someday, I want to get this talk taped and transcribed.
Free Open Source and Hardware Symposium (FOSHS ’13) is aimed at bringing
together industry, developers, educators, the community and any other
interested parties to discuss open source, open hardware, open web and
academic/industry partnerships. We’ll be running the two-day workshop at
the Department of Information Technology, Cape Peninsula University of
Technology on October 10-11, 2013.
The first day of the event is mainly for staff members and students to
interact with one another, and more importantly to showcase the various
projects they are working on.
Translation Sprint on the Mozilla Projects
Presenter: Arky; Technologist, Programmer and an Artist
Arky’s primary goal is to have face time with the symposium participants who could help kick-start a Mozilla community in Cape Town. He will also run an African-languages localization workshop (a 30-minute introduction talk and a 1-hour translation sprint).
ZeroReserve: The Bitcoin Reserve Bank
Presenter: Rüdiger Koch; CTO and founder of the first African Bitcoin Exchange
Rüdiger will be presenting ZeroReserve. If Africans could adopt Bitcoin and ZR for trade, it is in Africans’ hands to redefine the power map of the financial world: you can put the Fed back in its place and do the whole world a favor, and most of all yourselves.
“You never change things by fighting the existing reality. To change something, build a new model that makes the existing model obsolete.”
Mozilla: Firefox OS Tools and Apps
Presenter: Arky; Technologist, Programmer and an Artist
Arky will be returning to introduce new features in Mozilla Firefox. These include the Firefox web developer tools, Firefox OS design, and developing apps for the Firefox OS phone.
Practical Guidelines for Starting an Institutional Repository (IR)
Presenter: Hilton Gibson; Linux System Administrator, Stellenbosch University.
Friday 16:00–17:00
These guidelines are targeted at academic institutions in developing countries worldwide that want to start an open access research repository and want to know in detail what is required and how to do it step by step. The focus during development of the system has been long-term digital preservation, security, stability and interoperability on the internet, using open systems. This soup-to-nuts overview may be particularly useful for those involved in the early stages of planning for an institutional repository.
OpenGIS Web Map Tiling Service (WMTS) is becoming the standard used for distributing raster maps to the web and mobile applications, cell-phones, tablets as well as desktop software.
Monmouthpedia was the first Wikipedia project to embrace a whole town—specifically, the Welsh town of Monmouth (pron.: /ˈmɒnməθ/ MON-məth; Welsh: Trefynwy).
Geographical Information System (GIS) applications have existed since the early 1960s, but evidence suggests that adoption of GIS technologies still remains relatively low in many sectors.
This presentation will reflect on my experience of using the OSGeo Live system (versions 5.0 and 6.0) for delivering practical class teaching of GIS at Masters level.
The impressive list of OSGeo Projects shows the necessity of developing open source software. Behind every line of code, there is the work of a developer.
Realtime geospatial data has become more prevalent and relevant in the areas of disaster response, crisis identification, government operations, and business branding and engagement.
OpenLayers 3 enables a huge range of new web mapping functionality.