Import workflows with PowerShell

I have been working on a PowerShell script to export all of the workflows from one vCO server and then import them back into another.

I have the export working very well!

That said, I cannot figure out the code for the import - I was wondering if anyone knows how. Here is what I have:

$body = "file = $(get-content C:\workflows\ConfigureNTP.zip-raw).

Invoke-RestMethod - URi $categoryuri - POST method - body $body - Credential $cred

-ContentType "application/zip" - Headers @{'Content-Disposition '=' form - data'; "" name "=' configureNTP '; "" filename "=" configureNTP.zip ";}

I receive a status 415 - the server denied the request because the request entity is in a format not supported by the requested resource for the requested method.

I am out of ideas - I was wondering if someone has done this before using the Invoke-RestMethod cmdlet.

It worked on my PC

See the attached script & workflow example

.\Import-workflow.ps1 -vcoHost xx.xx.xx.xx -user vcoadmin -pass vcoadmin -categoryId 8a978cc84509e09c01450a18f9bc0013 -workflowFiles C:\Temp\Content_byte1.workflow
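
Since the attachment is not reproduced here, below is a minimal sketch of the multipart upload that avoids the 415 (untested against a live vCO; $categoryUri, $cred, and the file path are taken from the question). The key point is that the endpoint expects multipart/form-data as the top-level Content-Type, not application/zip:

# Read the ZIP as raw bytes and embed them in a hand-built multipart body.
# ISO-8859-1 maps bytes 1:1 to characters, so the payload survives the trip.
$boundary  = [System.Guid]::NewGuid().ToString()
$enc       = [System.Text.Encoding]::GetEncoding('ISO-8859-1')
$zipAsText = $enc.GetString([System.IO.File]::ReadAllBytes('C:\workflows\ConfigureNTP.zip'))
$CRLF      = "`r`n"

# Each part carries its own Content-Disposition/Content-Type headers.
$body = "--$boundary$CRLF" +
        "Content-Disposition: form-data; name=`"file`"; filename=`"ConfigureNTP.zip`"$CRLF" +
        "Content-Type: application/zip$CRLF$CRLF" +
        $zipAsText + $CRLF +
        "--$boundary--$CRLF"

Invoke-RestMethod -Uri $categoryUri -Method Post -Credential $cred `
    -ContentType "multipart/form-data; boundary=$boundary" -Body $body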

Tags: VMware

Similar Questions

  • vCAC 6: Workflow with PowerShell error

    I am trying to call a simple PowerShell script that adds a DNS record after the machine is provisioned.

    The script works fine when I run it on the server, but I get the following error when it is called from vCAC:


    Workflow "WFStubMachineProvisioned" failed with the following exception:

    The term "Add-DnsServerResourceRecordA" is not recognized under the name of a cmdlet, function, script file, or an executable program.

    Check the spelling of the name, or if a path has been included, make sure the path is correct, and then try again.



    Make sure the module or snap-in is available on the DEM server.
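
    For example, a quick check along those lines (a sketch; it assumes the DnsServer module from the DNS RSAT tools is what provides the cmdlet):

    # Verify the cmdlet is available on the DEM host before the workflow uses it.
    if (-not (Get-Command Add-DnsServerResourceRecordA -ErrorAction SilentlyContinue)) {
        # Load the module explicitly; if this fails, install the DNS RSAT tools
        # on the DEM server first.
        Import-Module DnsServer -ErrorAction Stop
    }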


  • Workflow with an assistant

    I have a question about a workflow with an assistant who helps me select the images to be used for a show.  Here's my current workflow:

    (1) Base import and adjustment of images.  Cull the obvious losers.

    (2) Export images so that my assistant can rate/cull them.

    (3) Import my assistant's ratings and use them to help determine which images to use.

    (4) Create the show from the images.

    I am struggling with the best way to manage steps 2-3.  My assistant does not use the same computer that I do, so using the same LR catalog and rating in the LR Library module does not work.  I would like to be able to publish to a shared directory, have a tool that lets my assistant rate the images there, and then import those ratings transparently back into LR.  I think the Publish API has a facility to import *some* data - it can download comments from photo-sharing sites.  Can it do this?  Is this feature already implemented?  Is there a plugin that does this?

    Does anyone have an idea of how to improve this workflow?

    Thank you!

    David

    I have a LR catalog and 1000's of photos, but I have two computers I want to work with them on, so my workflow is to host the catalog and photos permanently on a USB drive, which I back up to my internal drive, so I always have a backup copy of everything in case I lose the USB drive.  There is no exporting and importing multiple times, just opening the catalog on two different computers.  I have the USB drive mapped to the same drive letter on both computers, so I don't have to mess with remapping folders.

    USB 3.0 portable drives are fast enough.  In a Mac environment I believe there is a reasonably fast FireWire option, too.

    ==

    However, if you want to work with pictures on the local disk, what is the problem with giving the assistant a copy of the whole LR catalog (which is relatively small compared to the size of the photos, in any case) and the photos via a share on their computer, letting them rate the photos, then simply copying the catalog back across the share?  Make sure the paths to the pictures (and, to avoid confusion, the catalog) are the same on both computers and it should work OK.  Both computers would access the photos via a local drive letter, not a network share; only the copy is made through the share.  You probably don't want to overwrite your catalog with their copy immediately, in case they changed something inadvertently, but eventually you can treat the extra copies on your side the way you treat the backups that accumulate over time.

    As for privacy with what you have in your large catalog: without the previews or photos they would not be able to see the images, just read the keywords.  If that is not private enough, just segment your catalog into more than one, put the photos the assistant is allowed to see in a catalog you run the above process with, and keep your more private photos in another catalog.

  • Status of an ActiveRoles 7 web site with PowerShell

    I have been searching for a way to monitor an ActiveRoles web site with PowerShell. I get stuck when pulling a page request via Invoke-WebRequest and System.Net.WebClient.

    Example 1:
    $ARSURL = "https://ars.org"
    $Creds = Get-Credential
    Invoke-WebRequest -Uri $ARSURL -Credential $Creds
    And I get the following error: "Invoke-WebRequest : The remote server returned an error:
    401 - Unauthorized: Access is denied due to invalid credentials."

    Example 2:
    $webclient = New-Object System.Net.WebClient
    $webclient.Credentials = New-Object System.Net.NetworkCredential($username, $password, $domain)
    $webpage = $webclient.DownloadString($url)
    And I get the following error: Exception calling "DownloadString" with "1" argument(s):
    "The remote server returned an error: (401) Unauthorized."

    Any idea how I can connect to the ARS site through PowerShell?

    Well - I answered a question about ARS 7 without thinking about it - I'm not there yet.  This answer applies to ARS 6.9... it may also apply to version 7 - if the page can be called directly.  If not, your mileage may vary.

    If you're still with me... I played with this a little; there is more to explore... but here is one style of performing the task.

    For me - a quick way to check the availability of an ARS web site is to pull the page and display the user ID connected to the ARS service, the web version, plus the status of the page.

    We maintain 4 distinct groups of ARS servers, and I point the $ARSURL below at three ARS web servers to check that each site is up and is currently pointed at the correct partner ARS Administration Service host.

    $ARSURL = "myhost.myplace.com/.../About.ASPX ".

    $page = Invoke-WebRequest -Uri $ARSURL -UseDefaultCredentials

    ($page.AllElements | Where-Object {$_.tagName -eq "TBODY"}).innerText

    Returns:

    Current user: DOMAIN\USERID

    Role: ActiveRoles administrator

    Web Interface version: 6.9.0.5483

    Administration Service: myhostinGA.myplace.com

    Administration Service version: 6.9.0.5483


    Also - if you are looking for something to script into an automated status report - I'd go with the simple $page.StatusCode or $page.StatusDescription.

    The HTTP code for a good result is 200... I have only seen two codes: up or not.  If the site is not up, the code returned is 500.  If the page is building/busy, you will get a long wait and then a timeout.

    $ARSURL = "myhost.myplace.com/.../About.ASPX"

    $page = Invoke-WebRequest -Uri $ARSURL -UseDefaultCredentials

    $page.StatusDescription

    Returns

    Ok
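
    Wrapped up for a scheduled report, that check might look like this (a sketch; the function name and timeout are my own choices, not from the thread):

    # Returns a one-line status per site; any exception (401, timeout, DNS...)
    # is reported as DOWN rather than stopping the report.
    function Test-ArsSite {
        param([string]$Url)
        try {
            $page = Invoke-WebRequest -Uri $Url -UseDefaultCredentials -TimeoutSec 30
            "$Url : $($page.StatusCode) $($page.StatusDescription)"
        }
        catch {
            "$Url : DOWN ($($_.Exception.Message))"
        }
    }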
  • How to create POP or IMAP accounts with PowerShell?

    I want to create POP or IMAP accounts with PowerShell on Windows 7, 8, and 10. Does anyone know how to do this?
    Thank you!

    This is the forum for PowerShell scripts. When reposting, remember to indicate which e-mail client you want to create the accounts for. There are a lot of them, and most are not scriptable.
    https://social.technet.Microsoft.com/forums/scriptcenter/en-us/home?Forum=iTCG

  • Cancel a defragmentation with Powershell (Server 2008 R2)

    Hello world

    We defrag our servers with PowerShell every night (the original script is more complex):

    $AllVols = Get-WmiObject -Class Win32_Volume -Filter "DriveType = 3"

    foreach ($Vol in $AllVols) { $Vol.Defrag($false) }

    Now we want to cancel the defragmentation gracefully from PowerShell when it reaches a specific time (for example, 06:00)

    (like the dfrgui.exe button)

    Any ideas?

    Thank you

    In order to diagnose your problem, we need you to run the Windows Performance Toolkit; the instructions are in this wiki.

    If you have any questions, do not hesitate to ask.

    Please run the trace when you encounter the problem.
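
    Back to the original question: one way to get a cancellable defrag (a sketch, not from the thread; it trades Win32_Volume.Defrag for defrag.exe child processes precisely so they can be stopped, which is roughly what the dfrgui.exe button does):

    # Kick off defrag.exe per local volume so each run is a process we can stop.
    $vols  = Get-WmiObject -Class Win32_Volume -Filter "DriveType = 3" |
             Where-Object { $_.DriveLetter }          # skip mount-point volumes
    $procs = $vols | ForEach-Object {
        Start-Process defrag.exe -ArgumentList $_.DriveLetter -PassThru
    }

    # Compute the next 06:00 cutoff, then poll until done or out of time.
    $cutoff = (Get-Date).Date.AddHours(6)
    if ((Get-Date) -ge $cutoff) { $cutoff = $cutoff.AddDays(1) }
    while ((Get-Date) -lt $cutoff -and ($procs | Where-Object { -not $_.HasExited })) {
        Start-Sleep -Seconds 60
    }

    # Stop whatever is still running at the cutoff.
    $procs | Where-Object { -not $_.HasExited } | Stop-Process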
  • Help using suite-api getStatsOfResourcesCSV with PowerShell Invoke-RestMethod

    Hi all

    I want to extract vRops metric data to CSV. I found getStatsOfResourcesCSV but cannot make it work... the file comes out in JSON or XML, but never in CSV.

    The documentation isn't really clear to me... does anyone know how I can call it so the data comes out in CSV?

    Part of the script:

    $ContentType = "application/json; charset=utf-8"

    $header = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"

    # For JSON output

    $header.Add("Accept", "application/json")

    # For XML output

    # $header.Add("Accept", "application/xml")

    Invoke-RestMethod -Method GET -Uri "https://192.168.0.3/suite-api/api/resources/stats?resourceId=UUID1&resourceId=UUID2&statKey=cpu|costop_summation&statKey=cpu|usage_average&statKey=mem|usage_average&statKey=cpu|costopPct&rollUpType=AVG&intervalType=DAYS" -Credential $cred -ContentType $ContentType -Headers $header -OutFile 'Output.csv'

    Hello

    A blog about the vRops API in general: vRops API consumed with Powershell - Michael Ryom

    I haven't played with getStatsOfResourcesCSV and haven't had the time to - but in general with the vRops API you must specify the format - if you look at my blog you can see that I used '&format=csv' to get data out in CSV instead of XML (I believe you can also do it this way for JSON).

    Hope this helps
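
    Applied to the script above, that suggestion would look something like this (a sketch; the host, UUIDs, and stat keys are the question's placeholders, and '&format=csv' is the part taken from the answer):

    # Ask the stats endpoint for CSV explicitly via format=csv in the query string.
    $uri = "https://192.168.0.3/suite-api/api/resources/stats?" +
           "resourceId=UUID1&resourceId=UUID2" +
           "&statKey=cpu|usage_average&rollUpType=AVG&intervalType=DAYS" +
           "&format=csv"
    Invoke-RestMethod -Method Get -Uri $uri -Credential $cred -OutFile 'Output.csv'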

  • Where to write the condition when designing workflows with a conditional step?


    Hello

    Again, I want to design a BP and workflow with a conditional step.

    I have designed the BP and workflow.

    As directed by the uDesigner user guide, I added a trigger before the condition step.

    But I do not understand where to set the condition to be tested.

    If I need to write the condition in the trigger element itself, how do I proceed?

    For example, I want to write a condition such as Total cost >= 100000; if it passes, the workflow must follow one path, and if it fails, another.

    The trigger conditions are specified in the workflow setup.  After you have created a new setup, open it and click the Settings tab.  You will see a tree structure of the workflow with its conditional branches.  Click a conditional branch to set the trigger parameters for that route's condition.

  • Passing a parameter to a workflow with page fragments shows an HTTP 404 error

    Hello

    I have a parent workflow with page fragments dropped as a region on a page. A page fragment in the parent workflow displays the read-only Employees table. The Department Id in the table appears as a link. When you click this link, the following happens:

    • An action listener is called, which calls a managed bean that retrieves the Department Id and sets it in the pageFlowScope.
    • This value is passed as an input parameter to the child workflow. The child workflow contains a page whose display type is inline-popup.

    The problem is that when no parameter is passed to the child workflow, the popup works fine. But if the parameter is passed, it gives the following error:

    ADF_FACES-60105: HTTP error status code: 401.

    The parameter is passed from parent to child as:

    #{pageFlowScope.DepartmentIdBean.value}

    (DepartmentIdBean is the bean class that gets and sets the Id selected Department)

    The parameter is received in the child workflow as:

    #{pageFlowScope.pdeptId}

    Can someone please help me solve this problem? Is the way the parameter is passed creating the problem? The same scenario works fine if the child workflow is invoked with page fragments as an external window. I am using JDev 12c.

    I looked at the code and modified it to make it work. There were a few errors; the main reason you got the error was that you tried to read non-existent parameter values.

    Download the working app: OTNempDeptTaskFlow.zip | JDev & ADF Goodies

    After downloading, rename the doc to .zip and you can then unzip it.

    Timo

  • Importing data with impdp into a table with 3 new columns

    Hello

    Is it possible to import data with impdp into tables with 3 new columns?

    Kind regards

    William

    To do this, I use this method:

    Add the three columns to the source table and create the package:

    CREATE OR REPLACE PACKAGE DATAPUMP_TECH_COLS
    IS
        FUNCTION SET_DML_DATE (p1 IN TIMESTAMP) RETURN TIMESTAMP;

        FUNCTION SET_DML_TYPE (p2 IN VARCHAR2) RETURN VARCHAR2;

        FUNCTION SET_DML_SCN (p3 IN NUMBER) RETURN NUMBER;
    END DATAPUMP_TECH_COLS;
    /

    CREATE OR REPLACE PACKAGE BODY SYS.DATAPUMP_TECH_COLS
    IS
        FUNCTION SET_DML_DATE (p1 IN TIMESTAMP) RETURN TIMESTAMP
        IS
        BEGIN
            RETURN SYSDATE;
        END;

        FUNCTION SET_DML_TYPE (p2 IN VARCHAR2) RETURN VARCHAR2
        IS
        BEGIN
            RETURN ' ';
        END;

        FUNCTION SET_DML_SCN (p3 IN NUMBER) RETURN NUMBER
        IS
        BEGIN
            RETURN 0;
        END;
    END;
    /

    Export the table with REMAP_DATA:

    expdp DIRECTORY=TEMP_DIR PARALLEL=8 TABLES=PIVOTMAT2.ACCTG_LINE LOGFILE=expdp_acctg.log COMPRESSION=ALL EXCLUDE=STATISTICS \
    DUMPFILE=ACCTG1.dmp,ACCTG2.dmp,ACCTG3.dmp,ACCTG4.dmp,ACCTG5.dmp,ACCTG6.dmp,ACCTG7.dmp,ACCTG8.dmp REUSE_DUMPFILES=YES \
    REMAP_DATA=PIVOTMAT2.ACCTG_LINE.DML_TYPE:SYS.DATAPUMP_TECH_COLS.SET_DML_TYPE \
    REMAP_DATA=PIVOTMAT2.ACCTG_LINE.DML_DATE:SYS.DATAPUMP_TECH_COLS.SET_DML_DATE \
    REMAP_DATA=PIVOTMAT2.ACCTG_LINE.DML_SCN:SYS.DATAPUMP_TECH_COLS.SET_DML_SCN

    Import the table:

    Impdp "" / as sysdba "" DIRECTORY = SRC_PIVOT TABLE_EXISTS_ACTION = TRONQUER REMAP_SCHEMA = PIVOTMAT2:STGPIV.

    DUMPFILE = ACCTG1.dmp, ACCTG2.dmp, ACCTG3.dmp, ACCTG4.dmp, ACCTG5.dmp, ACCTG6.dmp, ACCTG7.dmp, ACCTG8.dmp PARALLEL = 8

    Finally, drop the columns if necessary.

  • How can I control Illustrator with Powershell?

    I need to automate the conversion of SVG images to monochrome TIFF with Group 4 compression. Illustrator is the only tool I have found that converts the SVG files I work with accurately. I want to automate the process using PowerShell. Is there a more appropriate tool for this task than Illustrator? If so, what is it? Otherwise, are there examples of controlling Illustrator with PowerShell?

    At its most basic...

    # Attach to (or launch) Illustrator via COM and drive it from PowerShell.
    $iapp = New-Object -ComObject Illustrator.Application
    $idoc = $iapp.documents.add()
    $itext = $idoc.textFrames.add()
    $itext.contents = "Hello World from PowerShell!"
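
    Building on that, the batch conversion might be sketched like this (hedged: the Open/Close calls follow Illustrator's scripting object model, the SaveOptions constant is an assumption to verify against your version's COM type library, and the export step is left as a comment because its exact signature and TIFF options depend on the Illustrator version):

    # Iterate the SVG files and open each one in Illustrator via COM.
    $iapp = New-Object -ComObject Illustrator.Application
    Get-ChildItem 'C:\images\*.svg' | ForEach-Object {
        $idoc = $iapp.Open($_.FullName)
        # ...export to TIFF here (e.g. an Export call with TIFF options),
        # then post-process to monochrome/Group 4 if Illustrator can't do it.
        $idoc.Close(2)   # 2 = aiDoNotSaveChanges (assumed enum value)
    }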
    
  • Curl call to start a workflow with an Array/Composite type

    Hello, I'm trying to start a workflow that takes an Array/Composite type.

    I'm using curl to make the call, and the post fields look like this when sending just a string:

    {
        "parameters": [
            {
                "value": {
                    "string": {"value": "test"}
                },
                "type": "string",
                "name": "vm",
                "scope": "local"
            }
        ]
    }

    How would I format it if I had an Array/Composite type?

    The composite type has the fields Type and size, of types string and number.

    Well, I don't know what your workflow looks like, but here is what my quick test workflow looks like:

    POST: https://YOUR-VCO-SERVER:8281/vco/api/workflows/475a3967-2c29-4140-a63f-ec1822ec330b/executions/

    Headers:

    Content-Type: application/json

    Accept: application/json

    And here is the working JSON to put in the body:

    {
        "parameters": [
            {
                "value": {
                    "array": {
                        "elements": [
                            {
                                "composite": {
                                    "type": "CompositeType(Type:string,size:number):drives",
                                    "property": [
                                        {
                                            "id": "Type",
                                            "value": {
                                                "string": {
                                                    "value": "NTFS"
                                                }
                                            }
                                        },
                                        {
                                            "id": "size",
                                            "value": {
                                                "number": {
                                                    "value": 20
                                                }
                                            }
                                        }
                                    ]
                                }
                            },
                            {
                                "composite": {
                                    "type": "CompositeType(Type:string,size:number):drives",
                                    "property": [
                                        {
                                            "id": "Type",
                                            "value": {
                                                "string": {
                                                    "value": "ext3"
                                                }
                                            }
                                        },
                                        {
                                            "id": "size",
                                            "value": {
                                                "number": {
                                                    "value": 25
                                                }
                                            }
                                        }
                                    ]
                                }
                            }
                        ]
                    }
                },
                "type": "Array/CompositeType(Type:string,size:number):drives",
                "name": "drives",
                "scope": "local"
            }
        ]
    }
    

    I've included the workflow here as well.
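
    For anyone doing this from PowerShell instead of curl, the same request might look like the sketch below ($cred and the drives.json path are assumptions; the URL and headers are the ones above):

    # POST the JSON body to the workflow's executions endpoint.
    $body = Get-Content 'C:\temp\drives.json' -Raw
    Invoke-RestMethod -Method Post -Credential $cred `
        -Uri 'https://YOUR-VCO-SERVER:8281/vco/api/workflows/475a3967-2c29-4140-a63f-ec1822ec330b/executions/' `
        -ContentType 'application/json' `
        -Headers @{ Accept = 'application/json' } `
        -Body $body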

    Just a simple script:

    System.debug("Drives:");
    for each (drive in drives){
      System.debug("================");
      System.debug("Type: "+drive.Type);
      System.debug("Size: "+drive.size);
    }
    

    My results:

  • Develop import preset with Auto Tone

    After moving from LR5 to LR CC, I find I am no longer able to import photos with Auto Tone applied. I have a Develop preset called "General" that uses Auto Tone and a few other settings, and it has been my default import preset for some time. Since switching to CC, everything works except the Auto Tone. I tried updating the preset, with no change. The preset works great if I select the image and apply it after import.

    Today I tried changing the "Apply auto tone adjustments" preference, and afterwards all of my photos imported with an exposure of -5 and everything else at zero.

    Could you please roll back to the previous Lightroom version and then try to use the Develop preset.

    How to roll back: Instructions to restore a previous version of the update

  • Workflow with several collaborators

    Workflows have version control, incrementing the version as you make changes.  Is there anything analogous to feature branches in the vCO world?  The ability to merge changes in?

    If we have several people working on a workflow, I'm trying to figure out the best way to introduce new changes without directly replacing the workflow with the new copy (which could break other tasks that depend on its id).

    I have worked with vCO for 8 years, including on very large projects, and there was very rarely a case where I had to merge workflows. Most of the time it could be avoided by synchronizing each developer's vCO server to a central repository vCO server.

    And when I did need to do it, the comparison tool included in workflow synchronization was good enough for me to manage the changes. Maybe it will work for you as well (right-click / Synchronize).

  • Run a workflow with REST APIs

    Hi all

    I'm not able to run a workflow with the REST API.

    POST https://<<HOST>>:<<PORT>>/api/workflows/94dd5f20-c190-4023-a0a7-1589c46f3792/executions/

    Here is the response. vCO does not show any workflow runs, so I assume nothing was executed. I can GET https://wxpcpvcd006a:8281/api/stream/f29ebe52-27b2-42d1-84f1-6ecfb939326c to get all the info on the workflow. Any help appreciated.

    {
        "relations": {
            "total": 0,
            "link": [
                {
                    "href": "https://wxpcpvcd006a:8281/vco/api/stream/94dd5f20-c190-4023-a0a7-1589c46f3792/",
                    "rel": "up"
                },
                {
                    "href": "https://wxpcpvcd006a:8281/vco/api/stream/94dd5f20-c190-4023-a0a7-1589c46f3792/executions/",
                    "rel": "add"
                }
            ]
        }
    }

    Simply pass the opening and closing braces as your JSON content:

    {}
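
    In PowerShell terms, that answer amounts to something like this (a sketch; $cred is assumed, the host and workflow ID are from the question):

    # An empty JSON object is still a valid body - enough to start a workflow
    # that takes no inputs.
    Invoke-RestMethod -Method Post -Credential $cred `
        -Uri 'https://wxpcpvcd006a:8281/vco/api/workflows/94dd5f20-c190-4023-a0a7-1589c46f3792/executions/' `
        -ContentType 'application/json' -Body '{}'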
    
