Sorting OpenAPI 3 Specs by Paths

OpenAPI is the emerging standard for API definition and documentation.

It’s based on a specification file structured in JSON, and a number of editor and viewer tools are popping up here and there.

Unfortunately, one thing that seems pretty inconsistent is the sorting of the paths in the viewers versus the editors. The specification leaves it to the writer to decide in which order the API paths should appear, but most editors don’t let you reorder the paths inside the JSON file.

So I made this small-and-ugly script to sort the paths inside an OpenAPI JSON file alphabetically, which seems a pretty practical approach for documentation too.

It reads a file called openapi.json and writes out a file called openapi_sorted.json.

Feel free to use it as you like.


## Small tool to sort the paths inside an OpenAPI definition file
## Taken from

$mainObject = json_decode(file_get_contents("openapi.json"));

$paths = get_object_vars($mainObject->paths);

## Sort the path keys alphabetically
ksort($paths);

$mainObject->paths = new StdClass();

foreach ($paths as $eachPath => $eachValue) {
	$mainObject->paths->$eachPath = $eachValue;
}

file_put_contents("openapi_sorted.json", json_encode($mainObject, JSON_PRETTY_PRINT | JSON_UNESCAPED_SLASHES));
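As a quick sanity check, here is the same idea applied to a tiny inline spec (the paths are made up for illustration):

```php
<?php
// Same technique on an inline spec; the paths are illustrative.
$json = '{"openapi":"3.0.0","paths":{"/users/{id}":{},"/health":{},"/users":{}}}';
$spec = json_decode($json);

$paths = get_object_vars($spec->paths);
ksort($paths); // alphabetical order on the path strings

$spec->paths = new stdClass();
foreach ($paths as $path => $item) {
    $spec->paths->$path = $item;
}

echo implode("\n", array_keys(get_object_vars($spec->paths)));
// → /health, /users, /users/{id}
```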


In gCloud Storage, our storage-as-a-service system, we developed some years ago a set of chain technologies that let us dynamically extend the features of the storage subsystem, allowing it to transform incoming or outgoing files.

A while ago we developed a chain that allows our users to securely store a file: it ciphers the file when it enters the system and deciphers it when it’s fetched, without our side ever storing the password.

After some thinking we decided to embrace already existing technology and rely on openssl for the job.

So we had to write some code able to interact with a spawned openssl process. We did some trial and error, and of course we did our research on Google. After various attempts we settled on a piece of code that proved to be pretty reliable.

We tried it first on our Mac OS machines, then on our FreeBSD servers, and it worked flawlessly for a couple of years. Recently one of our customers asked for an on-premises installation of a stripped-down clone of gCloud Storage, which had to run on Linux (CentOS, if that’s relevant). We were pretty confident everything would go smoothly, but that wasn’t the case: when the system went live we found that, when deciphering files, it would lose some ending blocks.

Long story short, we found that on Linux a child process can exit while leaving data in the stdout buffer, while – apparently – it can’t on FreeBSD.

The code we had adopted included a specific check to make sure it wasn’t trying to interact with a dead process. Specifically:

if (!is_resource($process)) break;

was the guilty portion of the code: openssl was exiting, the code was detecting it and bailing out before fetching the rest of stdout/stderr.
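The essence of the fix is to treat EOF on the pipes, not the death of the process, as the signal to stop reading. A minimal sketch of the difference, using a plain shell `echo` as a stand-in for openssl:

```php
<?php
// Sketch of the fix: drain the pipes until feof(), not until the child dies.
$descriptors = array(
    0 => array('pipe', 'r'),
    1 => array('pipe', 'w'),
    2 => array('pipe', 'w'),
);
$process = proc_open('echo hello; echo world', $descriptors, $pipes);
fclose($pipes[0]);

// Give the child time to exit with data possibly still sitting in the pipe.
usleep(100000);
$status = proc_get_status($process);
// On Linux $status['running'] may already be FALSE here,
// yet the stdout buffer can still hold unread bytes.

$stdout = '';
while (!feof($pipes[1])) { // drain until EOF, not until the process dies
    $stdout .= fread($pipes[1], 8192);
}
fclose($pipes[1]);
fclose($pipes[2]);
proc_close($process);

echo $stdout; // "hello\nworld\n" arrives intact
```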

So in the end we came up with this:

public function procOpenHandler($command = '', $stdin = '', $maxExecutionTime = 30) {

    $timeLimit = (time() + $maxExecutionTime);

    $descriptorSpec = array(
        0 => array('pipe', 'r'), // child's stdin
        1 => array('pipe', 'w'), // child's stdout
        2 => array('pipe', 'w')  // child's stderr
    );

    $pipes = array();

    $response = new stdClass();
    $response->status = TRUE;
    $response->stdOut = '';
    $response->stdErr = '';
    $response->exitCode = '';

    $process = proc_open($command, $descriptorSpec, $pipes);
    if (!$process) {
        // could not exec command
        $response->status = FALSE;
        return $response;
    }

    $txOff = 0;
    $txLen = strlen($stdin);
    $stdoutDone = FALSE;
    $stderrDone = FALSE;

    // Make stdin/stdout/stderr non-blocking
    stream_set_blocking($pipes[0], 0);
    stream_set_blocking($pipes[1], 0);
    stream_set_blocking($pipes[2], 0);

    if ($txLen == 0) {
        // nothing to send: close the child's stdin right away
        fclose($pipes[0]);
    }

    while (TRUE) {

        if (time() > $timeLimit) {
            // max execution time reached
            $response->status = FALSE;
            break;
        }

        $rx = array(); // the program's stdout/stderr

        if (!$stdoutDone) {
            $rx[] = $pipes[1];
        }

        if (!$stderrDone) {
            $rx[] = $pipes[2];
        }

        $tx = array(); // the program's stdin

        if ($txOff < $txLen) {
            $tx[] = $pipes[0];
        }

        $ex = NULL;
        stream_select($rx, $tx, $ex, 1, 0); // block (up to 1s) until r/w is possible

        if (!empty($tx)) {
            $txRet = fwrite($pipes[0], substr($stdin, $txOff, 8192));
            if ($txRet !== FALSE) {
                $txOff += $txRet;
            }
            if ($txOff >= $txLen) {
                // all input sent: close stdin so the child sees EOF
                fclose($pipes[0]);
            }
        }

        foreach ($rx as $r) {

            if ($r == $pipes[1]) {

                $response->stdOut .= fread($pipes[1], 8192);

                if (feof($pipes[1])) {
                    fclose($pipes[1]);
                    $stdoutDone = TRUE;
                }

            } else if ($r == $pipes[2]) {

                $response->stdErr .= fread($pipes[2], 8192);

                if (feof($pipes[2])) {
                    fclose($pipes[2]);
                    $stderrDone = TRUE;
                }
            }
        }

        // If the child is gone, stop feeding it input, but do NOT break:
        // keep reading until both stdout and stderr reach EOF.
        if (!is_resource($process)) {
            $txOff = $txLen;
        }

        $processStatus = proc_get_status($process);
        if (array_key_exists('running', $processStatus) && !$processStatus['running']) {
            $txOff = $txLen;
        }

        if ($txOff >= $txLen && $stdoutDone && $stderrDone) {
            break;
        }
    }

    // Ok - close the process (if still running)
    $response->exitCode = @proc_close($process);

    return $response;
}
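For completeness, here is a simplified, standalone sketch of the same idea applied to the openssl use case: no time limit and no stream_select multiplexing, so it is only safe for small payloads, but it shows the cipher/decipher round trip end to end (the flags assume a reasonably recent openssl with `-pbkdf2` support; the passphrase is illustrative):

```php
<?php
// Simplified, standalone variant (no time limit, no stream_select):
// safe only for small payloads, but shows the openssl round trip.
function runThrough($command, $stdin) {
    $spec = array(
        0 => array('pipe', 'r'),
        1 => array('pipe', 'w'),
        2 => array('pipe', 'w'),
    );
    $proc = proc_open($command, $spec, $pipes);
    fwrite($pipes[0], $stdin);   // fine for small inputs; large ones need select()
    fclose($pipes[0]);
    $out = stream_get_contents($pipes[1]); // reads until EOF: nothing is lost
    fclose($pipes[1]);
    fclose($pipes[2]);
    proc_close($proc);
    return $out;
}

$secret = 'the file contents to protect';
$cipher = runThrough('openssl enc -aes-256-cbc -pbkdf2 -pass pass:s3cret', $secret);
$plain  = runThrough('openssl enc -d -aes-256-cbc -pbkdf2 -pass pass:s3cret', $cipher);

echo $plain; // the round trip gives back the original payload
```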

Have fun! 😉

PHP: CURL – Uploading a file via POST from behind a proxy

Recently I found myself having to simulate the behavior of a browser interacting with a web server from PHP. The goal of the PHP script being written was to simulate the upload of a document via POST. This is easily achieved in PHP through the cURL library. Searching the Internet turns up plenty of examples, from which the typical usage can be deduced:

/* The URL to POST to */
$request_url = "";

/* Tell cURL which file to send:
 * the form contains <input type="file" name="FileUP" /> */
$post_params['FileUP'] = "@/home/io/TestPdf2.pdf";

/* Any other parameter the form passes
 * along in the POST */
$post_params['AltroParametro'] = "Ezekiel";

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $request_url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $post_params);
curl_setopt($ch, CURLOPT_VERBOSE, TRUE);
$result = curl_exec($ch);

The trick lies in the @ prefix on the file name: that is how cURL understands we are talking about a file, and it behaves accordingly. Everything worked fine until I tried running the same code from behind Squid in transparent mode, which started replying “Invalid request”. After hours and hours of (useless) attempts, I discovered the cause is very simple: cURL sends an HTTP header that Squid does not support, “Expect: 100-continue”. To prevent this behavior it is enough to add one option:

curl_setopt($ch, CURLOPT_HTTPHEADER, array('Expect:'));

And everything will magically work.
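A side note for current PHP versions: the @ prefix was deprecated in PHP 5.5 in favor of the CURLFile class, so a modern version of the same upload, including the Expect: fix, looks like this (URL and file path are illustrative placeholders):

```php
<?php
// Modern variant: CURLFile instead of the "@" prefix.
// URL and file path are illustrative placeholders.
$upload = tempnam(sys_get_temp_dir(), 'pdf');   // stand-in for TestPdf2.pdf
file_put_contents($upload, '%PDF-1.4 demo');

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'https://example.com/upload');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, array(
    'FileUP'         => new CURLFile($upload, 'application/pdf'),
    'AltroParametro' => 'Ezekiel',
));
// The Squid fix from above: blank out "Expect: 100-continue".
$ok = curl_setopt($ch, CURLOPT_HTTPHEADER, array('Expect:'));
// $result = curl_exec($ch); // run against a real endpoint
```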