
The page assigns the value of the URL parameter gid to $gid. It then reads an XML file that looks like this:

```xml
<games>
  <game>
    <name>name 1</name>
    <appID>1234</appID>
  </game>
  <game>
    <name>name 2</name>
    <appID>5678</appID>
  </game>
</games>
```

The code I am using works correctly but takes about a second to load because there are so many <game> elements in the list. Is there a more efficient way to go about the foreach? Here is the code:

```php
$gid = (int) $_GET['gid'];
$gamespage = simplexml_load_file("http://gamepage.com/games?xml=1");
$games = $gamespage->games;

foreach ($games->game as $game) {
    if ($gid == $game->appID) {
        $appid = $game->appID;
        $gamename = $game->name;
    }
}

echo $gid, "<br />";
echo $appid, "<br />";
echo $gamename;
```
  • I would suggest you try to get this solution working instead.
    – Tim Cooper, Aug 5, 2012 at 1:58
  • It's probably not slow because of the foreach loop, but because you're fetching all of the data from a remote server every single time.
    – icktoofay, Aug 5, 2012 at 1:59
  • Profile your script. See how long it takes to fetch the data, and how long it takes to loop through the data. If you see the fetching is taking the bulk of the time, cache the data for x minutes rather than fetching on every request.
    – tigrang, Aug 5, 2012 at 2:18
  • I agree with icktoofay and Tim Cooper. XPath and downloading a "cached" version would speed this up drastically. Or, if you don't want to use XPath for some reason, then you can do as guillermoandrae stated and add a break in your if statement. Though I would suggest the first two solutions rather than the third. Otherwise this is as efficient as it gets.
    – mseancole, Aug 6, 2012 at 13:08
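
For reference, a minimal sketch of the XPath approach the comments suggest, assuming the feed matches the sample above. The `//game` query matches `<game>` nodes at any depth, so it tolerates extra wrapper elements; `$gid` is cast to int first, so interpolating it into the query is safe here.

```php
$gid = (int) $_GET['gid'];
$gamespage = simplexml_load_file("http://gamepage.com/games?xml=1");

// Let libxml find the matching <game> instead of looping in PHP.
$matches = $gamespage->xpath("//game[appID='$gid']");

if (!empty($matches)) {
    echo $gid, "<br />";
    echo $matches[0]->appID, "<br />";
    echo $matches[0]->name;
}
```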

2 Answers


Add a break to that if statement to stop looping once you've found the correct node.
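
Applied to the loop in the question, that looks like:

```php
foreach ($games->game as $game) {
    if ($gid == $game->appID) {
        $appid = $game->appID;
        $gamename = $game->name;
        break; // stop scanning once the matching <game> is found
    }
}
```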


Probably the only thing you can do to make it faster is to limit the number of results returned at one time. Could you use pagination or something? Is there a reason you need to get all of them at once? Even if the data were coming from a database, with enough records it would take a couple of seconds to fetch ALL of them. I'd return, for example, 50 per page.
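
If the feed can't be paginated, the caching the comments suggest has a similar effect: download the remote XML once every few minutes instead of on every request. A minimal sketch, assuming the script can write to a local file (the cache path and TTL are placeholders):

```php
$cacheFile = __DIR__ . '/games-cache.xml'; // hypothetical local cache path
$ttl = 300;                                // refresh after 5 minutes

// Re-download only when the cached copy is missing or stale.
if (!file_exists($cacheFile) || time() - filemtime($cacheFile) > $ttl) {
    copy("http://gamepage.com/games?xml=1", $cacheFile);
}

$gamespage = simplexml_load_file($cacheFile);
```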
