Nov 25th 2007 6:03PM Cry a little more, guys. Have you ever considered the fact that you might be taking the battlegrounds too seriously, as opposed to the other players not taking it seriously enough? After all, it's just a game. No, more than that, it's a game within a game, which means people should take it even less seriously, but those who take WoW too seriously tend to take the battlegrounds even more seriously.
Don't like that level 62 guy in your game of Alterac Valley? Too bad. That rogue teammate decided to stay in stealth and watch you die? Too bad. That druid got the flag but didn't shapeshift to get out of snares? Too bad. It's their game as much as it is yours. Live with it.
Nov 17th 2007 4:29AM I'm just looking for something a bit more editorial. I mean, granted, you're all journalists, blah, blah, blah, but if you're going to run a wacky picture with a serious news story, there is absolutely nothing wrong with actually going out and taking a stand on a subject. After all, it's not as though you guys write the news most of the time; it's mostly digested from other sites, except on those occasions where you land an interview with someone or other, at which point you can have all the fun you want with the questions, such as asking the Hellgate: London people, "What makes your game different from, say, a post-apocalyptic version of Diablo-meets-MDK?"
Honestly. It's the internet, and being a news aggregator essentially takes you out of the running for a Pulitzer, so you might as well have some fun. If you decided to get into real criticism, though, you wouldn't find much competition out there. The world needs more interpreters, not necessarily reporters or (in the case of news aggregators) repeaters: people with points of view that may or may not match individual readers', but that can spark debate amongst them via the comment system that is so often left blank, except in these situations of potential free property.
Nov 13th 2007 2:44PM Oh, let's not forget that he probably didn't pay for that copy of Excel that miraculously opened on his Xbox after hitting level 10. Forget the cops, Microsoft's gonna be on this guy like white on rice.
Nov 10th 2007 12:52AM @21 - You are a shining, perfect example of why people roll alts instead of grinding instances night after night: it's not any fun playing with people who have this idea of how to play "properly." It's a game, and I find it to be a terribly fun game when I'm out doing solo PvE. Instances, however, tend to bring out the worst in a lot of people, generally hardcore level 70s who think they're the only ones entitled to new content.
As such, I don't much care whether they go with just the one hero class, or whether they go with multiple hero classes, because the first people to play them aren't going to be laid-back casual players, who I find make the game palatable to my tastes. No, instead, it's going to be the instance-junkies, and so I know that for several months, I'll be able to count on every death knight I see calling me 'noob' and telling me to learn to play my class properly.
Oct 29th 2007 8:17AM Dear conspiracy-theorist hippie credit-phobes: According to the Department of the Treasury, companies most certainly can refuse cash for transactions. I quote (at length) from http://www.treas.gov/education/faq/currency/legal-tender.shtml#q1
...the Coinage Act of 1965, specifically Section 31 U.S.C. 5103, entitled "Legal tender," which states: "United States coins and currency (including Federal reserve notes and circulating notes of Federal reserve banks and national banks) are legal tender for all debts, public charges, taxes, and dues."
This statute means that all United States money as identified above are a valid and legal offer of payment for debts when tendered to a creditor. There is, however, no Federal statute mandating that a private business, a person or an organization must accept currency or coins as for payment for goods and/or services. Private businesses are free to develop their own policies on whether or not to accept cash unless there is a State law which says otherwise. For example, a bus line may prohibit payment of fares in pennies or dollar bills. In addition, movie theaters, convenience stores and gas stations may refuse to accept large denomination currency (usually notes above $20) as a matter of policy.
So, yeah, Apple's not doing anything more illegal than the local Starbucks refusing to crack a $100 bill for your Frappuccino.
Oct 7th 2007 4:27AM Actually, the latest integrated Intel chipsets really aren't that bad; everything from the X3000 up is (pixel) Shader Model 3.0 compliant, which is more than I can say for the AGP card I've got in my Windows box right now. Unfortunately, they haven't been implemented on the Mac platform as yet, which basically leaves the Mac mini and the MacBook out in the cold as far as games go. The dedicated graphics chipsets in the MacBook Pro and iMac (courtesy of Nvidia and ATI) aren't half bad, unless you're one of those people who has to upgrade, upgrade, upgrade until any further upgrade is marginal at best with regard to performance. And if you're one of those people, you're probably better off rebuilding your beige box every twelve months anyway.
But I digress.
Until Apple builds the GMA X3100 or X3500 series of Intel integrated chipsets into the MacBook and Mac mini motherboards, the entry-level Mac market will continue to be a joke for game publishers, as evidenced by the fact that the Mac releases of the current editions of Tiger Woods and Madden require a dedicated graphics chipset (and therefore an iMac, MacBook Pro, or Mac Pro). I can't say how much of this is due to Tiger Woods and Madden being brought to OS X through Cider rather than being natively coded against OpenGL. Going through Cider means the games feed Direct3D code to an interpreter, which translates it into something OS X can understand, causing a fairly massive performance hit compared to coding for OpenGL compatibility in the first place, which I'm fairly certain is how Blizzard manages to ship Mac games that stay competitive with PCs on equivalent hardware.
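To make the translation-overhead point concrete, here's a toy Python sketch (not Cider's actual mechanism; every name in it is made up for illustration): each graphics call routed through a translation shim pays extra marshaling and dispatch cost on top of the real work, and a game issues thousands of such calls per frame.

```python
import timeit

def native_call(x):
    # stand-in for a direct OpenGL call: just does the work
    return x * 2

def translated_call(x):
    # stand-in for a Direct3D call routed through a translation layer:
    # marshal the arguments, look up the equivalent native operation,
    # then dispatch -- extra work on every single call
    args = {"value": x}                      # repack arguments
    op_table = {"multiply": native_call}     # map foreign op -> native op
    return op_table["multiply"](args["value"])

n = 200_000  # rough order of draw/state calls across a few frames
t_native = timeit.timeit(lambda: native_call(1), number=n)
t_translated = timeit.timeit(lambda: translated_call(1), number=n)
print(f"native: {t_native:.4f}s  translated: {t_translated:.4f}s")
```

The results are identical either way; the translated path just burns CPU time on bookkeeping, which is the same CPU time the game would rather spend on simulation and AI.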
I've said this before, of course, and it remains my opinion that developers' addiction to DirectX creates a never-ending cycle of late Mac releases (if they arrive at all), diluting whatever potential the Mac has as a gaming platform. While Cider is a step forward in cutting the time it takes to port a game to the Mac (from a year or more down to a few months), the translation it requires represents a draw on the CPU that would be better spent on game functions, were the game initially coded for maximum compatibility, as demonstrated by games from id Software and Blizzard.
In short, it's partially Apple's fault, which can partially be remedied sometime within the next several months with a significant chipset update. However, it's also the fault of the developers for worshipping at the dark altar that is DirectX. But at least it's got to make Xbox 360 ports easy to do, and we all know that's where the money is, at least for the *ahem* "hardcore gamer." ... Of course, we won't let the hardcore gamer look at hardware sales charts, because it will only make him sad. But perhaps that's a clue as to where gaming on the Mac might as well go. There's money to be had in casual gaming.
Oct 2nd 2007 7:40PM With regard to the Wireless-N problem, Blizzard's excuse is, "Well, if you had waited until they'd established a standard for Wireless-N, we wouldn't be in this situation." Half of me agrees with them, half of me doesn't. Of course, at the moment, it's looking like there will never be a bona fide standard for N, which kind of makes the future more than a bit fuzzy for the early adopters.
Of course, yesterday, they rolled out voice chat to the remaining realms in a series of rolling restarts, which, as pointed out by larsiezwei, is causing massive amounts of lag. Blizzard's representatives in the official forums seem to be taking this less than seriously. I mean, when it comes to the slightest bit of Wrath of the Lich King news, their marketing department goes into overdrive and the press eats it up. New patch (technically, probably the voice chat, which preceded the patch) causes absurd amounts of lag? Not a news story, carry on, no comment.
Aug 2nd 2007 4:13PM @12: So, what you're saying, here, is that we should stop referring to the nation of Japan as 'Japan' because, like 'Japanimation', it begins with "Jap"?
Note: You can't have it both ways, so either the nation is Japan, or we're all going to have to start calling it Nippon, but that starts with "Nip," which is yet another derogatory term.
Jul 31st 2007 7:05PM "Hardly, my lord, it's just an eye. The gods saw fit to grace me with a spare."
Jun 20th 2007 5:01PM "Apple needs to give people more video card options. Even on the Mac Pro the options are slow, mediocre and $1650 workstation card.
"Somebody please tell me why Apples can't use "regular" video cards."
The reason Apples can't use regular (off-the-shelf) video cards is likely due to the fact that they want to keep the amount of hardware that they have to support as small as possible. After all, when Nvidia doesn't write a proper driver for Vista, people have this tendency to blame Microsoft for breaking their video cards. And you're not going to find Nvidia or ATI jockeying for position to rewrite drivers for computers that hold 7.6% of the home computer market.
However, I will agree with several of the commenters that hardware is a major sticking point in the Apple lineup. The current Intel graphics processors are nice, but they lack in-hardware transform & lighting and Shader Model 3.0 support. The next iteration of Intel's graphics processors will support these, along with a number of other features, and the lack of that chip is what kept me from buying a MacBook after the last revision. Maybe next time around.
I do not agree, though, that Apple should necessarily make it any easier for people to change the graphics card in, say, an iMac, as that's just inviting the sort of disaster that nearly befell me when I had to swap out the hard drive in my 12" iBook. Making more hardware user-swappable would only hurt the design of the computers, and I can tell you for sure that Apple cares more about its design than about your desire to play Quake Wars on a two-year-old computer.