I’ve been spending some time with Renoise recently, and this LADSPA plugin called Autotalent that acts as an auto-tuner / real-time pitch corrector. For a while I was playing around with it using pre-recorded samples, but the other night I thought it would be fun to try it out live.
Unfortunately I couldn’t get my motherboard to work with any of my (analog) microphones, so I pulled out an old Logitech USB microphone I had for my PlayStation 2. Linux recognized it, and I was able to record, but then I discovered that Renoise has an unusual shortcoming: it does not support monaural input sources. Well, this is Linux… there must be a way to make this work, right?
Sure enough, there is! I dunno if this was the smartest way to get it working, but I got Renoise to recognize the mic as a stereo microphone: I configured a virtual pcm device in ALSA, ran a Jack server with this virtual pcm as the capture device, and then reconfigured Renoise to use Jack instead of ALSA. The latency was noticeable, but still good enough to let me do my own T-Pain and Wallpaper renditions.
Here’s how it’s done:
$ arecord --list-devices
card 2: Microphone [Logitech USB Microphone], device 0: USB Audio [USB Audio]
Subdevice #0: subdevice #0
$ cat /etc/asound.conf
$ sudo jackd -d alsa -P hw:0,0 -C stereo_capture
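I didn’t capture the contents of /etc/asound.conf above, but a minimal sketch of the kind of virtual pcm involved might look like this (a hypothetical example, assuming the mic is card 2 as in the arecord output, and that the pcm name matches the -C stereo_capture argument passed to jackd):

```
# Hypothetical /etc/asound.conf sketch: present the mono mic
# (card 2, device 0) as a two-channel capture device.
pcm.stereo_capture {
    type route
    slave {
        pcm "hw:2,0"
        channels 1
    }
    # Duplicate the single slave channel (0) onto both client channels.
    ttable.0.0 1
    ttable.1.0 1
}
```

With something like that in place, jackd sees stereo_capture as a stereo capture device, and Renoise (pointed at Jack) is none the wiser.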
bit-tech got all excited by some statements from AMD’s head of GPU developer relations where he advocates killing the graphics API.
This is great for console gaming (in fact it’s already the norm there), but it’s a terrible idea for PC gaming.
A graphics programmer working on PC titles doesn’t want to have to code a game using the display list language of a bazillion different graphics cards. At the very least you would have to support the “common” set of cards from nVidia, AMD (ATI), and Intel. To make your game marketable, you would end up essentially recreating DirectX or OpenGL yourself, negating the benefit of eliminating these APIs in the first place.
On a console, especially if you are a first party developer, you want access to the display list language because that’s the most efficient way to make the best use of the hardware, hands down. And you only have one target platform to worry about, so it makes sense. I don’t know about Xbox, but on PSP and PS3 the only way to push the hardware to its limits is to code display list generation commands directly into your game and skip the graphics APIs all together. On PS3 you need to do this anyways in order to be able to generate graphics from the SPUs.
I would be surprised if Sony isn’t taking this approach on PSP2 also. It’s good for the console manufacturer to lock developers into a platform–and I suspect that might be where the AMD guy is coming from also.
But he’s wrong to advocate this for PCs. It would probably stifle innovation more than advance it (as he claims it would), because hardware vendors would now feel compelled to keep their opcodes backwards-compatible for fear of breaking older, popular games. You would recreate the same problem we have now with the “venerable” x86 platform.
Unless he’s talking about reconfigurable ISAs on the GPU. That could be interesting.
I have 4 hard drives in my system and I’ve been running them with RAID 1 pairs tied together using LVM. The first couple GB of each are set aside for /boot and swap space.
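As a sketch (with hypothetical device names, not the exact commands from my setup), that kind of layout is put together roughly like this:

```
# Hypothetical sketch: two RAID 1 pairs joined into one LVM volume group.
# Device and partition names here are illustrative only.
$ sudo mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sda2 /dev/sdb2
$ sudo mdadm --create /dev/md1 --level=1 --raid-devices=2 /dev/sdc2 /dev/sdd2
$ sudo pvcreate /dev/md0 /dev/md1
$ sudo vgcreate vg0 /dev/md0 /dev/md1
$ sudo lvcreate -n root -L 100G vg0
```

The installer then only needs to recognize the md arrays and the vg0 volume group, which is exactly what tripped up the Ubuntu installers below.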
I was running Debian and wanted to try Ubuntu, but Ubuntu Desktop doesn’t support this kind of configuration out of the box. Ubuntu Server advertises LVM support, but for whatever reason it didn’t recognize my configuration. I probably could have recreated the same configuration w/ Ubuntu Server’s installer, but I didn’t want to risk losing my data only to discover it doesn’t actually support it.
Finally after some trial and error I figured out the Ubuntu Alternate distribution has LVM and software RAID support if you run it in “expert” mode. It was a bit of a pain to manually walk it through the installer, but eventually it recognized my existing drive configuration. Definitely not the “friendly” Ubuntu installation experience I was hoping for, but it worked.
When the install was done it left me with a console-only bare-bones install. But since I had the Ubuntu Desktop CD handy, I popped it in the drive and ran:
$ sudo apt-get install ubuntu-desktop
…and away it went.
I’ve decided to put the foundation I wrote for Anytime Golf up for sale. I’m calling it the Bork3D Game Engine. I think I can carve out a niche in the iPhone engine space for programmer-types with high expectations of the hardware who want a solid foundation on which to build their game. There are other options out there, but I believe most game developers look at those and fear (rightfully so, I hear) that they won’t meet their performance criteria.
Selling a game engine is an interesting thing. I think the Unity folks have a good business model: they’re clearly targeting non-programmer types who want to put together something 3D quickly. Torque has their iTGB offering that’s similar. There’s a good deal of money to be made there. Unfortunately it doesn’t produce great games. (Oh no he didn’t!) OK in fairness these engines produce awesome games on PC. They are awesome PC engines. I’m a licensed Torque user and I love it. They’re just not mobile engines.
The Bork3D Game Engine was built for mobile platforms. It actually has its roots in Rude Engine, which I wrote for Vector Blaster, and it can run on Pocket PC, Symbian, and N-Gage in addition to iPhone. But it isn’t a complete game engine by any stretch of the imagination. If you want in-game content creation tools and a scripting language, please leave. However, if you want to build in-game content creation tools and install your favorite scripting language, c’mon in!
So what do you get?
- All the source code
- OpenGL ES abstraction layer
- Debug-rendering API
- Component-oriented game object system
- High-performance static and boned mesh rendering system w/ tool pipeline for 3dsmax and Maya
- Decorator system for rendering billboards
- Texture manager w/ tool pipeline
- Runtime “tweaker” for changing game variables via a web browser
- User interface widgets w/ abstractions for handling iPhone user input
- Font renderer w/ tool pipeline for generating fonts (supports unicode)
- In-game profiler
- Audio system for sound effects and background music
- Integration with the Bullet Physics SDK
- Unit test framework
And what does it cost? I’m selling it for only $49 (if you’re indie). Forty-nine bucks. That’s a lot of code for not much. I’m probably crazy. I guess I’m softening as I grow older, considering this is 1/15th of what I was selling Rude Engine for a few years ago.
See bork3d.com for more information.
There are also threads on TouchArcade and iphonedevsdk.com about the engine.
In the version control world, I’m a big fan of perforce. I first became a fan when I saw how fast it was–it can perform checkouts of gigabyte-sized repositories extremely quickly. I also really liked the perforce workflow–changelists are the “normal” way of working. However, I started to love perforce when I began playing with its branching capabilities. It’s ridiculously easy to create and integrate branches, and they have wonderful tools for helping you understand them.
At my new workplace we’re on svn, and while I wouldn’t say I “hate” it… That’s a pretty strong word… I am “very displeased.”
Speed isn’t so much of an issue here because we’re only dealing with megabyte-sized repositories, so a minute or two to checkout ~100MB or so isn’t too painful. The svn workflow has taken some getting used to. Nobody works with svn changelists–there are no good tools for managing them, so it’s wasted effort. It’s easier to just do another checkout of the repository and work in there. Since our repository here is relatively small, this is no big deal. (I can’t imagine what it would be like with a larger repository! Yikes.)
Branching in svn is downright painful. I suppose if you had nothing but text files running on one operating system and everything was in one flat directory structure you might say, “What are you talking about? It’s so easy!” but so far that hasn’t been my experience. Here are the problems I’ve encountered:
- Merging subdirectories instead of merging from the root. Apparently this is a known “no-no” according to the Subversion book. I made the “mistake” of merging a subdirectory a few revisions back and now whenever I merge from the root it wants to re-merge all the files in those subdirectories every single time. It marks their properties as modified even though they’re not being changed. Maybe this is just svnmerge.py being diligent but it’s annoying to see this massive list of files every time I merge, when really the “meat” of the merge is in just a handful of separate files. Weak sauce.
- Merging eol-style properties. There’s a known bug in svn 1.5.0-1.5.4 where if you make eol-style changes to a branch and you try to merge them to another branch, you may get an “inconsistent newlines in /tmp/tmp” error. I encountered this, and the only way I could figure out to get around it was to upgrade to svn 1.5.6, make the eol-style changes in the branch, commit, then perform the merge. I then had to downgrade back to svn 1.5.1 because 1.5.6 couldn’t deal with the branched subdirectories I mentioned previously. I’d see an “Error reading spooled REPORT request response” message when it got to the subdirectories that had been previously merged.
- Merging inconsistent newlines. Really? This problem arises because svn stores on the server whatever newlines you give it. I can’t think of any reason why the svn server would not store files as ‘eol-style=native’ by default. Storing the client’s newline format on the server seems like the exceptional case, not the common one. It’s irritating to have to merge newline inconsistencies.
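One partial safeguard (this is the standard svn auto-props mechanism, sketched from memory rather than taken from our setup) is to have every client apply svn:eol-style=native as files are added, via ~/.subversion/config:

```
# Hypothetical ~/.subversion/config snippet: auto-apply
# svn:eol-style=native to newly added source files.
[miscellany]
enable-auto-props = yes

[auto-props]
*.c = svn:eol-style=native
*.h = svn:eol-style=native
*.cpp = svn:eol-style=native
```

Files already in the repository still need an explicit `svn propset svn:eol-style native <file>` by hand–which, of course, is exactly the kind of eol-style change that trips the merge bug above.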
I miss perforce. Somebody needs to develop “poorforce”, a cheap version of perforce. 🙂
A lot of programs can’t deal with the “smart quotes” gcc prints in its errors. If you have one program calling gcc and parsing the output, you might see something like:
foo.h:45: error: expected â€˜}â€™ at end of input
One fix I’ve found for this is to change the locale gcc uses for localization. You can do this with the LC_ALL environment variable (see the gcc man page). Setting its value to “C” normalizes the output to standard ASCII.
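In bash, for example (this sets the variable for the current session; put it in ~/.bashrc to make it stick):

```shell
# Force gcc (and any other localized tool run from this shell)
# to emit plain-ASCII diagnostics instead of smart quotes.
export LC_ALL=C
```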
One feature I’ve come to rely on in Visual Studio is that it selects the currently active file in the project explorer. By default, SlickEdit doesn’t have this functionality, but you can add it yourself. I found some kindred spirits on the SlickEdit forums who wanted the same behavior. Just add the code on the first page for showInProject() to a new .e file, append the code on the second page for _switchbuf_auto_showInProject() to the bottom, load the .e file from the Macro -> Load Module menu, and you’re set!
This is the result of years of rigorous interviews while I was “embedded” with game programmers from many different backgrounds. I found that SPU (the PS3 coprocessor instruction set) programmers by far have the largest egos, but not necessarily the most desirable jobs. Shader programmers have the perceived “coolest” jobs in the industry but don’t have quite the same ego as SPU programmers. PS3 game programmers believe they are better than Xbox 360 programmers, but Xbox 360 programmers have a more desirable job because their games reach a wider audience. PSP game programmers are a bit of an anomaly because they believe the work they do is very challenging, but no one in the industry particularly wants to make PSP games. Mobile, Flash, and iPhone programmers sit at the bottom of both scales: their jobs are neither desirable nor do they carry much ego.