HFM data extract

Hello
When I extract data using the HFM client, the first line of the file is the "!Data" header.
I load the file into SQL Server using BULK INSERT.
I wanted to skip the first line, but if I use FIRSTROW = 2, it skips the first line of actual data instead.

When skipping rows, the SQL Server database engine looks only at the field terminators and does not validate the data in the fields of the skipped rows.
In other words, it skips by counting the number of field terminators needed to make up a row, not by looking at the row terminator, which is why the first line of real data gets skipped instead of just the "!Data" header.
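
For reference, here is a minimal BULK INSERT sketch; the staging table, file path and delimiters are hypothetical and would need to match the actual extract:

BULK INSERT dbo.HfmStaging
FROM '\\server\share\hfm_extract.txt'
WITH (
    FIELDTERMINATOR = ';',   -- adjust to the delimiter used in the HFM extract
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2      -- rows are skipped by counting field terminators, not header lines
);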

Can I extract the data from HFM without the "!Data" line in the file? Or can I pull the data into a SQL table directly using HFM utilities, etc.?

Thank you

Look into Extended Analytics in the Administrator's Guide. I believe it will give you exactly what you are looking for.
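
As an illustration of why Extended Analytics fits here: it pushes HFM data into a relational star schema that you can query directly, with no "!Data" header to skip. A minimal sketch, assuming a schema prefix of HFMEA; every table and column name below is an assumption about the generated schema and must be checked against your own database:

-- HFMEA_FACT, HFMEA_ENTITY, HFMEA_ACCOUNT and their columns are assumed names
SELECT e.LABEL      AS entity,
       a.LABEL      AS account,
       f.DATA_VALUE AS amount
FROM   HFMEA_FACT    f
       JOIN HFMEA_ENTITY  e ON e.ID = f.ENTITY_ID
       JOIN HFMEA_ACCOUNT a ON a.ID = f.ACCOUNT_ID;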

Tags: Business Intelligence

Similar Questions

  • HFM data extract Taskflow - dynamic POV possible?

    I am playing with a Taskflow that runs an HFM data extract and I was wondering if the POV can be defined dynamically, from a variable or something like that set in another stage.  Has anyone done something like this?

    Thank you

    Hello. Depending on the version you are working with, you can use member lists to define variables such as the current year and the current period. When you update the lists, the taskflow runs with these settings. This is useful for forms, reports, etc., as well.

    Eric

  • How to load HFM data into Essbase

    Hello

    How can we bring HFM data into an Essbase cube without using EAL, since we are having performance problems using EAL - DSS as a source for OBIEE reporting?

    With Extended Analytics, I heard we can only get level 0 HFM data into Essbase and would need to write the currency conversion and ICP elimination calculations in Essbase to roll up to the parent members. Is this true?

    Also, how can we convert HFM security to Essbase security?

    Please advise me on this.

    Thank you
    Vishal

    Security will be a bit tricky, as Essbase generally uses filters while HFM uses security classes. You can potentially use shared groups in Shared Services, but licensing issues can come into play depending on provisioning. Your best bet is maybe to look at LCM artifact export to handle it.

  • Data extract from Essbase to Oracle DB using report script

    I get an error saying that ODI cannot locate my report script. My Essbase server is on a different server from ODI. Can I copy the report script onto the ODI server? Is that the way to solve this problem?

    Documentation:
    "Column validation is not performed during data extraction using report scripts. Thus, the output columns of a report script are directly mapped to the corresponding columns of the connected source model."

    This means that the order of the columns in the report script must be exactly the same as the order of the columns in the Essbase source model for the extraction to run successfully.

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • Need help with hex data extraction and handling

    Hello

    I have been able to transmit hex commands over the serial port to my camera and receive hex responses successfully. I was able to confirm this by changing the display format to hex for both the control and the indicator. Now, I want to store a portion of the received hex data and use it in my future commands. Example of received data below:

    0001 1100 0010 E002 6 93 4D7E 2C4D AA66 F00D

    The red blocks will be incorporated into the next command, so that when I send that command it will look like this:

    F00D 0014 2000 0003 0000 0020 77AF E002 6 93 4D7E 2C4D

    In short: send the command and get the response, extract and store the data in the red blocks, then send a different command that contains the stored data. Everything is in hexadecimal format. The question is how can I extract/store the wanted bytes and then rebuild a hex command containing those bytes? Should I convert to a normal string, a byte array, or a hex table? If I use the latter 2 conversion methods, I need to ensure that the zeros are not lost.

    I am attaching my VI. Please take a look and help provide a solution.

    Thanks in advance.

    Here is what I had in mind (LabVIEW 2010):

  • Eloqua data extraction for use in analytics software

    Topliners,

    I use Eloqua 9 and I would like to extract customer information on individual email sends so that I can perform trend and regression analyses in third-party data analysis tools such as Tableau & SPSS.

    My hope is that I can extract the customer data (State / Zip Code / custom fields) and pull only basic e-mail statistics (sent / open / click / etc...) using the email address as the primary key.

    Has anyone tried to build such a report from Eloqua, or used the API to extract the send log / table rewrites / customer profile data for use in third-party tools?

    Any advice would be greatly appreciated.

    Thank you

    Corey Bauer

    Hi Corey,

    You can look at the scheduled report exports under report admin and have those transferred by FTP to your server. This will give you the Contact, Email and Campaign reports.

    Thank you

    Amit

  • Need to simplify CSV data extraction from a PDF form

    I have a complex PDF with fields to fill out. Once these are filled in, I want to extract the data in the fields for insertion into a database.

    I know that I can use Prepare Form, then More > Export Data... to create an .fdf file. Then I can use Merge Data Files into Spreadsheet... to generate a .csv file that contains all the data. Then I wrote code to extract that data and put it in my MS SQL Server database.

    Several steps are required and I'd like to simplify as much as possible.

    From the very beginning:

    1. When I select Export Data, I am presented with a dialog box that uses 'Test Adobe_data.fdf' as the default file name. Can this default be changed? Can it default to the name I used last time?

    2. When I select Merge Data Files into Spreadsheet, I am presented with a dialog box requesting the file. Can it not default to the .fdf file that I just created? This dialog box includes a check box (Include most recent list of files to export data from), but I don't want the most recent list, I want the .fdf file I just created to be the default. In addition, this procedure is meant to 'merge' data files; I just want to 'merge' one file! The only reason I use it is that it seems to be the only way to convert the .fdf into a (pseudo-).csv file.

    3. When I click 'Export' in the dialog box, I have to retype the name of the .fdf file, since this file type is not part of the 'Save as type' list, so I can't reuse the name.

    4. This step ends with another dialog (Export Progress) with a View File Now button and a Close Dialog button. I have to click Close Dialog.

    That sounds like a large number of steps, and it will be more difficult than it should be to train someone to follow these procedures.

    Is there a better (easier) way?

    Thank you.

    roricka

    You call each click a 'step'? That is a bit of an exaggeration...

    Anyway, are you exporting the values from a single file or from several files? If a single file, then yes, this task can be scripted, and the script can be attached to a button that can be triggered by a simple click.

    If several files, then it is much more complicated and would require using an Action and a button, and I don't think it will save many 'steps' compared to the original function.

  • Parsing and XML data extraction [10g]

    Hi Experts,

    I have an XML CD catalog file. I want to extract the values of the fields from the XML and load them into a table with the same number of columns.

    Can someone help me with how to extract the XML data?

    Example XML

    --------------------

    <!-- edited by XMLSpy® -->
    <CATALOG>
      <CD><TITLE>Empire Burlesque</TITLE><ARTIST>Bob Dylan</ARTIST><COUNTRY>USA</COUNTRY><COMPANY>Columbia</COMPANY><PRICE>10.90</PRICE><YEAR>1985</YEAR></CD>
      <CD><TITLE>Hide your heart</TITLE><ARTIST>Bonnie Tyler</ARTIST><COUNTRY>UK</COUNTRY><COMPANY>CBS Records</COMPANY><PRICE>9.90</PRICE><YEAR>1988</YEAR></CD>
      <CD><TITLE>Greatest Hits</TITLE><ARTIST>Dolly Parton</ARTIST><COUNTRY>USA</COUNTRY><COMPANY>RCA</COMPANY><PRICE>9.90</PRICE><YEAR>1982</YEAR></CD>
      <CD><TITLE>Still got the blues</TITLE><ARTIST>Gary Moore</ARTIST><COUNTRY>UK</COUNTRY><COMPANY>Virgin records</COMPANY><PRICE>10.20</PRICE><YEAR>1990</YEAR></CD>
      <CD><TITLE>Eros</TITLE><ARTIST>Eros Ramazzotti</ARTIST><COUNTRY>EU</COUNTRY><COMPANY>BMG</COMPANY><PRICE>9.90</PRICE><YEAR>1997</YEAR></CD>
    </CATALOG>

    Thank you

    -----------------

    Emmeline

    with xml_source as
      (select q'{<CATALOG>
                  <CD><TITLE>Empire Burlesque</TITLE><ARTIST>Bob Dylan</ARTIST><COUNTRY>USA</COUNTRY><COMPANY>Columbia</COMPANY><PRICE>10.90</PRICE><YEAR>1985</YEAR></CD>
                  <CD><TITLE>Hide your heart</TITLE><ARTIST>Bonnie Tyler</ARTIST><COUNTRY>UK</COUNTRY><COMPANY>CBS Records</COMPANY><PRICE>9.90</PRICE><YEAR>1988</YEAR></CD>
                  <CD><TITLE>Greatest Hits</TITLE><ARTIST>Dolly Parton</ARTIST><COUNTRY>USA</COUNTRY><COMPANY>RCA</COMPANY><PRICE>9.90</PRICE><YEAR>1982</YEAR></CD>
                  <CD><TITLE>Still got the blues</TITLE><ARTIST>Gary Moore</ARTIST><COUNTRY>UK</COUNTRY><COMPANY>Virgin records</COMPANY><PRICE>10.20</PRICE><YEAR>1990</YEAR></CD>
                  <CD><TITLE>Eros</TITLE><ARTIST>Eros Ramazzotti</ARTIST><COUNTRY>EU</COUNTRY><COMPANY>BMG</COMPANY><PRICE>9.90</PRICE><YEAR>1997</YEAR></CD>
                 </CATALOG>}' the_xml
         from dual
      )
    select x.*
      from xml_source s,
           xmltable('/CATALOG/CD'
                    passing xmltype(s.the_xml)
                    columns "TITLE"   varchar2(30) path 'TITLE',
                            "ARTIST"  varchar2(30) path 'ARTIST',
                            "COUNTRY" varchar2(5)  path 'COUNTRY',
                            "COMPANY" varchar2(30) path 'COMPANY',
                            "PRICE"   number       path 'PRICE',
                            "YEAR"    number       path 'YEAR'
                   ) x

    TITLE                ARTIST           COUNTRY  COMPANY         PRICE  YEAR
    Empire Burlesque     Bob Dylan        USA      Columbia         10.9  1985
    Hide your heart      Bonnie Tyler     UK       CBS Records       9.9  1988
    Greatest Hits        Dolly Parton     USA      RCA               9.9  1982
    Still got the blues  Gary Moore       UK       Virgin records   10.2  1990
    Eros                 Eros Ramazzotti  EU       BMG               9.9  1997

    Regards

    Etbin

  • XML data extraction using EXTRACT

    Hi, I have an XML file from which I want to parse the fields into a DB table:

    <?xml version="1.0" encoding="UTF-8"?>
    <FIXML><Batch><MktDataFull BizDt="2012-07-13">
      <Instrmt Sym="JCPRXU" ID="JCPRXU" Desc="JCP.SR.XR.USD" SecTyp="CD" Src="H" SubTyp="S" MMY="201209" MatDt="2012-09-20" Mult="0.01" Exch="CMD" UOM="CTL" UOMCcy="USD" UOMQty="1" PxUOM="IPNT" ValMeth="CD" CpnRt="1.0" IntAcrl="2012-06-20" CpnPmt="2012-09-20" NotnlPctOut="100.0" Snrty="SR" RstrctTyp="XR" DayCntMeth="ACT/360" Tenor="0M">
        <AID AltID="US708130AC31" AltIDSrc="105"/><AID AltID="JCP.SR.XR.USD.12U.100" AltIDSrc="101"/><AID AltID="1 201209 JCPRXU" AltIDSrc="H"/><AID AltID="1 201209 JCPRXU" AltIDSrc="100"/>
        <Evnt EventTyp="5" Dt="2008-09-19"/><Evnt EventTyp="7" Dt="2012-09-19"/><Evnt EventTyp="19" Dt="2012-10-05"/><Evnt EventTyp="100" Dt="2012-07-16"/><Evnt EventTyp="8" Dt="2012-07-14"/><Evnt EventTyp="9" Dt="2012-09-20"/><Evnt EventTyp="101" Dt="2012-03-20"/><Evnt EventTyp="102" Dt="2008-09-20"/><Evnt EventTyp="103" Dt="2008-09-22"/><Evnt EventTyp="104" Dt="2012-09-19"/><Evnt EventTyp="111" Dt="2012-09-20"/><Evnt EventTyp="112" Dt="2012-06-20"/><Evnt EventTyp="113" Dt="2012-03-20"/><Evnt EventTyp="114" Dt="2012-07-12"/><Evnt EventTyp="115" Dt="2012-07-16"/>
      </Instrmt>
      <Full Typ="6" Px="99.7433368" Mkt="CMD" QCond="6" PxTyp="1" OpenClsSettlFlag="1"></Full><Full Typ="6" Px="234.5254" Mkt="CMD" QCond="6" PxTyp="6" OpenClsSettlFlag="1"></Full><Full Typ="Y" Px="40.0" Mkt="CMD" PxTyp="1" OpenClsSettlFlag="1"></Full><Full Typ="6" Px="234.5212" Mkt="CMD" QCond="7" PxTyp="6" OpenClsSettlFlag="1"></Full><Full Typ="B" Mkt="CMD" OpenClsSettlFlag="4" Sz="0"></Full><Full Typ="C" Mkt="CMD" OpenClsSettlFlag="4" Sz="0"></Full><Full Typ="z" Px="0.18" Mkt="CMD" PxTyp="1" OpenClsSettlFlag="1"></Full><Full Typ="y" Px="0.1899965" Mkt="CMD" QCond="6" PxTyp="5" OpenClsSettlFlag="1"></Full>
      <InstrmtExt><Attrb Typ="100" Val="24"/><Attrb Typ="101" Val="0"/><Attrb Typ="109" Val="0"/><Attrb Typ="103" Val="24"/><Attrb Typ="102" Val="24"/><Attrb Typ="110" Val="3"/><Attrb Typ="29" Val="Y"/><Attrb Typ="112" Val="Y"/></InstrmtExt>
    </MktDataFull></Batch></FIXML>


    Right now, I'm just trying to extract the first 3 fields, BizDt, Sym and ID. I am using the following to parse it:


    SELECT
        EXTRACT(VALUE(p), '/BizDt').getStringVal()       AS DATE_,
        EXTRACT(VALUE(p), '/Instrmt/Sym').getStringVal() AS SYMBOL,
        EXTRACT(VALUE(p), '/Instrmt/ID').getStringVal()  AS ID_
    FROM TABLE_NAME s,
         TABLE(XMLSEQUENCE(EXTRACT(XMLType.createXML(s.CDS_CLOB), 'FIXML/Batch/MktDataFull/*'))) p
    WHERE s.ID_ = 1

    But I get nothing back. My guess is that this is because the XML data does not have formal opening and closing tags, because if I change my XML like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <FIXML><Batch><MktDataFull><BizDt>2012-07-13</BizDt><Instrmt><Sym>JCPRXU</Sym><ID>JCPRXU</ID><Desc>JCP.SR.XR.USD</Desc><SecTyp>CD</SecTyp></Instrmt></MktDataFull></Batch></FIXML>

    I am able to get the data. Therefore, in order to solve this problem, what should I do with my original XML? Should I reformat the tags?

    Thank you

    When you nest xsl:for-each elements, the select expression is evaluated in the context of the enclosing instance.

    Consider this: nesting a for-each over the Full elements inside the for-each over Instrmt means you are trying to match the Full items as children of Instrmt, which is not correct, because they are actually siblings.
    The same goes for AID, Evnt, etc.

    I don't know exactly what kind of structure you want.
    Flattening it all does not make much sense, given that the groups of repeating sibling items have no apparent correlation between them; essentially, you end up with a big Cartesian product.

    I would approach this by storing the repeated elements in separate tables with a parent/child relationship to preserve the hierarchical nature of the data (if necessary).
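
    As a purely illustrative sketch (not the approach described above), the three values the original poster asked about can be read directly from the attribute-based XML with XMLTABLE; the table and column names (TABLE_NAME, CDS_CLOB, ID_) are taken from the question, and note that BizDt, Sym and ID are attributes, so they are addressed with '@':

    SELECT x.*
    FROM   TABLE_NAME s,
           XMLTABLE('/FIXML/Batch/MktDataFull'
                    PASSING XMLType(s.CDS_CLOB)
                    COLUMNS bizdt VARCHAR2(10) PATH '@BizDt',
                            sym   VARCHAR2(20) PATH 'Instrmt/@Sym',
                            id_   VARCHAR2(20) PATH 'Instrmt/@ID') x
    WHERE  s.ID_ = 1;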

  • HFM data tables

    Hi gurus,

    I use HFM v4 and I was wondering if anyone is familiar with the tables. I have a query against the data table (AppName_DCE_3_2011). I can find the joins for lEntity, lAccount, lICP, lCustom1, lCustom2, lCustom3 and lCustom4, with the exception of lValue, which contains 2 values: 17 and 56. I have searched the list of tables for one that could relate to lValue and I can only find the AppName_VALUE_ITEM table, but its data does not match the data values.

    Does anyone know the meaning of 17 and 56 in the lValue column?

    Very much appreciate your help.
    Thank you
    Anna

    The Value dimension item IDs (lValue) for the DCE tables are derived from the CURRENCIES table. The formula should be:
    (Currency ItemID x 3) + 15 = Currency Total
    (Currency ItemID x 3) + 16 = Currency Adjs
    (Currency ItemID x 3) + 17 = Currency

    For example, if your first currency is USD (ItemID = 0) and your second currency is CAD (ItemID = 1):
    USD Total = 15
    USD Adjs = 16
    USD = 17
    CAD Total = 18
    CAD Adjs = 19
    CAD = 20

    The offset of 15 accounts for the Value dimension members held in the VALUE_ITEM table.

    -Keith
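
    For reference, a minimal SQL sketch of that mapping; the per-application currency table name (AppName_CURRENCIES) and its ItemID/Label columns are assumptions and should be checked against your schema:

    SELECT d.lEntity,
           d.lAccount,
           d.lValue,
           c.Label
             || CASE MOD(d.lValue - 15, 3)   -- 0 = Total, 1 = Adjs, 2 = the currency itself
                  WHEN 0 THEN ' Total'
                  WHEN 1 THEN ' Adjs'
                  ELSE ''
                END AS value_member
    FROM   AppName_DCE_3_2011 d
           JOIN AppName_CURRENCIES c          -- assumed table and column names
             ON c.ItemID = TRUNC((d.lValue - 15) / 3)
    WHERE  d.lValue >= 15;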

  • Hyperion Enterprise 6.3.1 data extraction error

    I extracted the data and at the end it returned an error for 2 entities. The problem is that it is extracting data for an entity that does not exist. Any idea why?

    There is no log entry written for it.

    This is the error log.


    ! \\tdwm-aawhy-cp01\entdata\FSS\IFRS_YE_Rollover_R1\Extract data AL_IFRS ACTMAVG.txt
    ACTMAVG
    1
    12
    D0786.ADJML,N250099.CDN.RCDN,1806525.000000,1755614.000000,1768445.000000,1780035.000000,
    D0786.ADJML,N257099.CDN.RCDN,-16904.000000,-15093.000000,-15093.000000,-15093.000000,
    D0786.ADJML,N607599.CDN.RCDN,17694.000000,17191.000000,17335.000000,17392.000000,
    D0786.ADJML,P200099.CDN.RCDN,2644.000000,2500.000000,2514.000000,2529.000000,
    D0786.ADJML,P200099.CDN.ROTH,4694.000000,4694.000000,4694.000000,4694.000000,
    D0786.ADJML,P300599.CDN.RCDN,49977.000000,403.000000,12897.000000,24182.000000,
    D0786.ADJML,P301199.CDN.RCDN,1750000.000000,1750000.000000,1750000.000000,
    D0786.ADJML,P304099.CDN.RCDN,115.000000,582.000000,929.000000,
    D0786.ADJML,P301189.CDN.RCDN,1750000.000000,
    D1124.ADJML,N250099.CDN.RCDN,360104.000000,352066.000000,354059.000000,356051.000000,
    D1124.ADJML,P200099.CDN.RCDN,2075.000000,2000.000000,2008.000000,2015.000000,
    D1124.ADJML,P200099.CDN.ROTH,39.000000,-7.000000,-7.000000,-7.000000,
    D1124.ADJML,P203099.CDN.RCDN,0.000000,-1.000000,0.000000,0.000000,
    D1124.ADJML,P300599.CDN.RCDN,7990.000000,66.000000,2047.000000,4028.000000,
    D1124.ADJML,P301199.CDN.RCDN,350000.000000,350000.000000,350000.000000,
    D1124.ADJML,P304099.CDN.RCDN,8.000000,11.000000,15.000000,
    D1124.ADJML,P301189.CDN.RCDN,350000.000000,
    ! \\tdwm-aawhy-cp01\entdata\FSS\IFRS_YE_Rollover_R1\Extract data AL_IFRS ACTMENT.txt
    ACTMEND
    1
    12
    D0786.ADJML,N250099.CDN.RCDN,1806525.000000,1755614.000000,1768445.000000,1780035.000000,
    D0786.ADJML,N257099.CDN.RCDN,-16904.000000,-15093.000000,-15093.000000,-15093.000000,
    D0786.ADJML,N607599.CDN.RCDN,17694.000000,17191.000000,17335.000000,17392.000000,
    D0786.ADJML,P200099.CDN.RCDN,2644.000000,2500.000000,2514.000000,2529.000000,
    D0786.ADJML,P200099.CDN.ROTH,4694.000000,4694.000000,4694.000000,4694.000000,
    D0786.ADJML,P300599.CDN.RCDN,49977.000000,403.000000,12897.000000,24182.000000,
    D0786.ADJML,P301199.CDN.RCDN,1750000.000000,1750000.000000,1750000.000000,
    D0786.ADJML,P304099.CDN.RCDN,115.000000,582.000000,929.000000,
    D0786.ADJML,P301189.CDN.RCDN,1750000.000000,
    D1124.ADJML,N250099.CDN.RCDN,360104.000000,352066.000000,354059.000000,356051.000000,
    D1124.ADJML,P200099.CDN.RCDN,2075.000000,2000.000000,2008.000000,2015.000000,
    D1124.ADJML,P200099.CDN.ROTH,39.000000,-7.000000,-7.000000,-7.000000,
    D1124.ADJML,P203099.CDN.RCDN,0.000000,-1.000000,0.000000,0.000000,
    D1124.ADJML,P300599.CDN.RCDN,7990.000000,66.000000,2047.000000,4028.000000,
    D1124.ADJML,P301199.CDN.RCDN,350000.000000,350000.000000,350000.000000,
    D1124.ADJML,P304099.CDN.RCDN,8.000000,11.000000,15.000000,
    D1124.ADJML,P301189.CDN.RCDN,350000.000000,

    If you are sure that the entity does not exist, try using the orphan entities functionality. Go to the Entities module, in the main menu select Task -> Entities without owner and run the unknown entities report; if the entity that is causing the problem is listed there, run the purge entities without owner task. Make sure that you keep the other entities you need.

  • Data extract from Planning to DWH with a Classic Planning license using ODI

    Can we extract data from Planning to a data warehouse with a Classic Planning application license using ODI?

    We currently use ODI to load data into Planning.


    Thanks in advance

    You will need to check with your Oracle account manager to see which adapters you are licensed for.
    If you have a license for the Essbase adapter, which I believe you should have, then you will be able to extract the data.

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • Data extract - ASO source to BSO target

    Hello

    The client wants to retrieve a rollup of the members of a SKU dimension from an ASO cube into a BSO cube. Extracts of BSO data to ASO are normally quite simple with DATAEXPORT. However, since ASO cannot run traditional calc scripts, I'm wondering if there are methods to extract data from ASO to BSO? ASO does not store the rolled-up values, does it? If so, wouldn't that eliminate the partitioning option? Also, does ASO support report scripts? Curious, as I am not an ASO guru.

    Thank you

    You can have an XREF on the BSO side pulling data from the ASO database, though I wouldn't have thought that would help much, and XREFs can be slow.
    Try the other options: Re: extraction of ASO cube data

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • ERPI - data extract from Oracle EBS subledgers

    Hello

    Is it possible with ERPI (11.1.1.3) to extract subledger data, such as assets, from E-Business Suite (11.5.10), or can it only be used to pull from Oracle GL?

    Thank you
    Matt

    Hello

    ERPi pulls only from the EBS General Ledger.

    Thank you

  • Show tooltips and a line on a scatter chart where the data is extracted by SQL


    I'm looking to create a scatter chart to display the results of a SQL query. The chart currently appears as

    anychart.png

    I want to have the points joined by a line, and tooltips that appear when a single point is selected. My existing custom XML is shown at the bottom of this post. I have tried various formats between the <tooltip_settings> tags, but nothing appears. Similarly, I have tried various line format settings. Interestingly, if the string #DATA# is replaced with a block such as...

    <data>
      <series name="Sales" type="Line">
        <point y="1.172" x="1063512000" />
        <point y="1.916" x="1095048000" />
        <point y="5.57" x="1126584000" />
        <point y="15.0" x="1158120000" />
        <point y="144" x="1189656000" />
      </series>
    </data>

    ... then a line and tooltips appear. It seems that whatever is in the #DATA# block is not enough to create a line and tooltips. The data is retrieved by a query that returns the 4 columns LINK, TAG, X_VALUE, Y_VALUE.

    Is there a way in which something can be configured so that the line and tooltips appear?

    Existing custom XML

    <?xml version="1.0" encoding="UTF-8"?>
    <anychart>
      <settings>
        <locale>
          <date_time_format>
            <format>%u</format>
          </date_time_format>
        </locale>
      </settings>
      <charts>
        <chart plot_type="Scatter">
          <data_plot_settings>
            <line_series>
              <tooltip_settings enabled="true">
              </tooltip_settings>
            </line_series>
          </data_plot_settings>
          <chart_settings>
            <title>
              <text>Scatter line</text>
            </title>
            <axes>
              <y_axis>
                <title>
                  <text>Count</text>
                </title>
                <scale type="Linear" />
                <labels>
                  <format>{%Value}{numDecimals:0}</format>
                </labels>
              </y_axis>
              <x_axis>
                <scale type="DateTime" major_interval="1" minor_interval="3" major_interval_unit="Year" minor_interval_unit="Months" minimum_offset="0" maximum_offset="0" />
                <labels>
                  <format>{%Value}{dateTimeFormat:%YYYY}</format>
                </labels>
                <title enabled="true">
                  <text>Date</text>
                </title>
              </x_axis>
            </axes>
          </chart_settings>
          #DATA#
        </chart>
      </charts>
    </anychart>

    This is because APEX generates the series not as a Line type, but as a Marker type:

    <series type="Marker">
    ...
    </series>

    You will need to provide the data yourself, in the format you want, that is to say as a series of Line type. You cannot simply configure it.
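
    For illustration, a minimal sketch of how a Line-type series could be built from the query columns mentioned above (X_VALUE, Y_VALUE) using Oracle SQL/XML functions; the table name chart_data and the series name are hypothetical, and how the resulting XML is then supplied in place of #DATA# is left open:

    -- builds <data><series type="Line">...<point x=".." y=".."/>...</series></data>
    SELECT XMLSERIALIZE(
             DOCUMENT
             XMLELEMENT("data",
               XMLELEMENT("series",
                 XMLATTRIBUTES('Sales' AS "name", 'Line' AS "type"),
                 XMLAGG(
                   XMLELEMENT("point",
                     XMLATTRIBUTES(t.x_value AS "x", t.y_value AS "y"))
                   ORDER BY t.x_value)))
             AS CLOB INDENT) AS data_xml
    FROM   chart_data t;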
