Temperature drift on the 6289

Hello

We use the 6289 for one of our applications where we need to sweep between voltages every second. For a long time we suspected the algorithm we use to calculate the voltage, but recently, when we ran a loop to check exactly what was happening, we saw a significant temperature drift in the voltages.

Over a five-minute period we see drift of up to 0.5 mV. The absolute resolution we work with in our application is very small, so a 0.5 mV deviation is a change of major importance.

The reason we concluded it is temperature drift is that earlier the drift was very high; when we placed an external fan near the MXI chassis, the drift changed and came down to 2-3 mV in 5 minutes. Now we have made sure the MXI temperature is perfectly consistent, and the room temperature is close to 18 °C, but we are still facing this drift.

We have already checked the fan of the MXI chassis and it works well; we are running it at high speed.

Here is the configuration that we work with.

6289/ao2 -> 0 to 9.50 V

6289/ao3 -> -1 to +1 V

and the value of 6289/ao2 is configured as compensation for 6289/ao3.

Please see the attached photo of the loopback test results.
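
For reference, here is roughly how we run the loopback check, as a minimal sketch in Python with the nidaqmx package. The device name "Dev1", the test levels, and the external ai0 loopback wiring are assumptions, not our exact setup:

import time
import nidaqmx

with nidaqmx.Task() as ao_task, nidaqmx.Task() as ai_task:
    # Hold static levels on the two outputs (ranges as configured above).
    ao_task.ao_channels.add_ao_voltage_chan("Dev1/ao2", min_val=0.0, max_val=10.0)
    ao_task.ao_channels.add_ao_voltage_chan("Dev1/ao3", min_val=-1.0, max_val=1.0)
    ao_task.write([2.5, 0.5])  # arbitrary test levels, held constant

    # ao3 is wired back into ai0 for the loopback measurement.
    ai_task.ai_channels.add_ai_voltage_chan("Dev1/ai0", min_val=-1.0, max_val=1.0)

    readings = []
    t0 = time.time()
    while time.time() - t0 < 300.0:      # log for five minutes
        readings.append(ai_task.read())  # one on-demand sample
        time.sleep(1.0)

    drift_mv = (max(readings) - min(readings)) * 1000.0
    print(f"peak-to-peak drift over 5 min: {drift_mv:.3f} mV")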

Please tell me how to stabilize the output so that I can use it for my application. With the present voltage drift, it is not suitable for our application.

Thank you and best regards,

Alok Dandois.


Tags: NI Hardware

Similar Questions

  • LabVIEW thermocouple temperature readings

    I use an SCXI-1001 with SCXI-1102 and SCXI-1303 (LabVIEW 8.2) to record temperatures from type K thermocouples.  My problem is that the temperatures I see in LabVIEW are off by about 5-10 °C, AND they stray +/-10 °C throughout the day.  From what I've read, I think this could be linked to the CJC. Right now I just have the CJC Source set to "Constant" with a value of 25.  I can adjust that to help with the offset, but the temperatures just drift throughout the day and the readings become incorrect again.  The room temperature changes somewhat in the area where I have this setup, and the temperature drift seems to be linked to that.  Any help or suggestion would be appreciated. Thank you.

    Hi Zawer,

    Your SCXI-1303 terminal block has a precision thermistor cold-junction sensor, so the CJC source parameter should be set to "Built-In".  Try resetting the 1102 in Measurement & Automation Explorer, then restart LabVIEW.  Does the problem still occur?
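
    If it helps, the equivalent selection in a DAQmx task looks like this. A minimal sketch in Python with the nidaqmx package; the channel name "SC1Mod1/ai0" is an assumption for your SCXI slot:

    import nidaqmx
    from nidaqmx.constants import CJCSource, TemperatureUnits, ThermocoupleType

    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_thrmcpl_chan(
            "SC1Mod1/ai0",
            units=TemperatureUnits.DEG_C,
            thermocouple_type=ThermocoupleType.K,
            cjc_source=CJCSource.BUILT_IN,  # the 1303's thermistor, not a constant 25
        )
        print(task.read())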

  • iPhone 6 stops at a cool temperature

    I started having this issue last winter. When I'm out in the cold, my phone suddenly shuts off no matter what percentage of battery is left. Even in milder weather (5 °C in March, in Canada) my iPhone 6 won't last a 20-minute bike ride home while in the inside pocket of my jacket. Once I get home, I can turn it right back on and it will last the rest of the day on the same charge.

    So it seems the battery cannot provide the power the phone needs to operate at low temperatures, yet it can last all day when I am inside. Has anyone else seen this? And would a battery replacement solve the problem?

    Have you already checked this?

    Your device is designed to perform well in a wide range of ambient temperatures, with 62° to 72° F (16° to 22° C) as the ideal comfort zone. It is especially important to avoid exposing your device to ambient temperatures above 95° F (35° C), which can permanently damage battery capacity. In other words, your battery will not power your device as long on a given charge. Charging the device in high ambient temperatures can damage it further. Even storing a battery in a hot environment can damage it irreversibly. When using your device in a very cold environment, you may notice a decrease in battery life, but this is temporary. When the battery's temperature returns to its normal operating range, its performance will return to normal as well.

    copied from: https://www.apple.com/batteries/maximizing-performance/

  • HP ENVY 15: WiDi on Windows 10 gives black screen.

    Attempted to connect a TV to my Envy 15 through a Microsoft display adapter, but I only get a black screen when connected.

    The Envy product number is L0L25PA #ABG.

    The Microsoft display adapter model is 1628 with new firmware, 1.3.8220.

    Intel(R) WiDi ver. 6.0.40.0.

    The MS display adapter has the latest Miracast driver, ver. 10.0.10240.16394, downloaded from Microsoft.

    The display adapter brings up a "ready to connect" display in 1080p with a black background.

    Once connected, the laptop says it is "connected to the MS display adapter" and the TV goes black.

    This is in dual-screen mode. The TV is receiving some kind of signal from the WiDi, but it isn't showing a "screen"; with no signal at all the TV shows blue.

    So either I'm not sending the correct data to the MS display adapter, or the adapter is not translating it properly.

    The adapter accepted a firmware update via WiDi, which ran OK, though during the update everything was shown on the laptop screen, not the TV.

    Everything works fine on the TV with an HDMI cable.

    Is there a fix for this, or do I have to wait while MS, Intel, and HP sort it out?

    Windows 10 is fully updated.

    The MS display adapter was not working under 8.1 either.

    THE FOLLOWING DAY:

    Found a driver update for the Intel(R) HD Graphics 5500. The Intel update checker app does not pick this one up; I found it when checking the drivers in Device Manager.

    I can now get a projection, but it's a bit on and off as to when it wants to work. Once it starts it is good, apart from a slight flicker/stutter when displaying video on the TV. Sometimes it takes a reset of the whole lot and a new attempt to get it to start, to the point where I wonder if there is a temperature-related hardware problem with the MS display adapter.

    Finding the reset button on the adapter has helped.

    Another day later:

    I found that my problem is the TV I use for my screen, a cheap 32-inch Vivo that works very well over an HDMI cable. It is very temperamental about whether it picks up the MS adapter or not.

    Everything is good when it works, but you cannot change projection modes etc. without it dropping out, and once that happens it is very difficult to make it display again.

    I found this out when I was able to try a Samsung, which worked perfectly. I can change modes, connect and disconnect as much as I like without any problem.

    Thanks for the help, HP forum,

    I hope this will help others who have a similar problem.

    How can I mark this as resolved?

    Sorry SDF15, I won't give you a kudo or a "solved", since you did not help at all.

    Even your suggestion showing how to mark a question as resolved does not apply in this case, since you did not solve the problem.  I checked, and there is no "solved" option on the original post. If I could mark this response, I would.

  • NI PCI-5154 digitizer time base drift question

    Hello all NI experts:

    We have an NI PCI-5154, used for several years now. We use it to capture pulse waveforms whose timing relationships we care about.

    We operate the digitizer at 1 GHz sampling and, until today, assumed the sampling rate was precise and constant. Today a group member raised a doubt, since the digitizer specification says the time base drift is ±7 ppm/°C. If this is true, suppose our operating temperature is 20 degrees higher than the temperature at which the digitizer was calibrated; the drift could then reach up to 140 ppm, which times 1 GHz is 140 kHz? That would be a killer for our measurements.

    Please help clarify this question so we can estimate the errors in our measurements.

    Unfortunately, we have no data on the repeatability of the time base drift.

    To calculate the real time base frequency, simply reverse the calculations we've discussed so far.  Measure a very precise source on the digitizer, and any change in the signal's frequency will be caused by the non-ideal time base period.

    For example, you measure a 10 MHz signal at 1 GHz, and its frequency is reported as 10.001 MHz.  So we are off by 1 kHz.  1 kHz = 10 MHz * X ppm; solve for X: X = 100 ppm.  Thus our sample clock is off by 100 ppm.  1 GHz * 100 ppm gives us a period of 0.9999 ns or 1.0001 ns.  Since our measured frequency increased by 1 kHz, the signal was compressed when interpreted with a 1 ns dt.  Thus the real clock period was 1.0001 ns.
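
    The same arithmetic in a few lines, if you want to plug in your own numbers (a small Python sketch; the values are the ones from the example above):

    f_reference = 10.0e6    # known-good source frequency, Hz
    f_measured  = 10.001e6  # frequency reported by the digitizer, Hz
    f_sample    = 1.0e9     # nominal sample rate, Hz

    # (10.001 MHz - 10 MHz) / 10 MHz * 1e6 = 100 ppm time base error.
    ppm = (f_measured - f_reference) / f_reference * 1e6

    # The reported frequency is high, so the signal was compressed: the real
    # clock period is longer than the nominal 1 ns by the same 100 ppm.
    true_period_ns = 1e9 / f_sample * (1.0 + ppm / 1e6)

    print(f"time base error: {ppm:.1f} ppm")               # 100.0 ppm
    print(f"true sample period: {true_period_ns:.4f} ns")  # 1.0001 ns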

    Because it sounds like you can't control the temperature of your work environment, the most accurate approach is to measure the time base clock drift immediately before and after taking your measurements.  If you run your tests in a temperature-controlled environment, you might be able to get away with measuring the time base clock drift less often, but you should still do it regularly.  The reason is the aging of the time base oscillator (this affects all oscillators): the accuracy of every oscillator gradually drifts over time.  Our specifications account for this drift over the external calibration interval, but if you are going to measure the actual accuracy, time is another factor that will affect the accuracy of the time base.

    For completeness, I should also say that when you measure the ppm accuracy with your test signals, what you see is the absolute accuracy: not only the accuracy of the time base, but also the accuracy of the signal source.  So it is very important to have a precise source for the test signals.

    I hope this helps.

    Nathan

  • PXI-6551 channel skew dependence on temperature and time

    Hi all

    What is the temperature dependence of the channel-to-channel skew for the HSDIO PXI-6551, relative to the calibrated ambient temperature?  Assume that after dynamic calibration ('as left') the channel skew is 60 ps for channel 3 at 23.2 °C.  How does this value change with temperature?

    There is probably also a known drift over time.  How much will this vary over a period of two years?

    Please advise.  I'm looking into the possibility of using the calibrated channel-skew values as offsets to get more precision.

    Thank you!

    Anand

    It seems that Ryan M addressed this in another of your posts:

    using calibration data for PXI-6551 as compensation for greater accuracy

  • guidance inertia temperature control

    Dear people,

    I would like to ask for advice from people experienced in control theory.

    Recently I have been working on a very interesting device called an inertial-guidance vacuum calorimeter (it is a type of isothermal calorimeter). I'm writing a new LabVIEW program for this device, and I am trying to increase its stability/accuracy as much as I can. This calorimeter can measure sample heat at micro-watt power levels. The measurement principle is the following (see attached diagram):

    We want to measure the heat flowing between the outside and the sample holder; this is done via a Peltier (thermoelectric) sensor. To obtain valid data, the temperature must be very precisely constant everywhere in the calorimeter (isothermal method). The inner parts of the calorimeter are thermally protected against ambient temperature with vacuum and radiation shields. A closed double-walled vacuum chamber circulates water through a heat exchanger, the heat reservoir.

    For thermal stability, there are 3 control loops:

    Control loop 1: this keeps the water temperature constant using PID (classic 4-wire platinum resistance method).

    Control loop 2: this is the first stage stabilizing the temperature of the inner part; a classic PID loop with a platinum resistance sensor drives a Peltier heat pump between the support and the "base". This achieves a temperature constancy of only about 0.1 mK.

    Control loop 3: when the ultimate stability is reached with control loop 2, loop 2 is switched off and its drive current is held at its last constant value. Now loop 3 begins, and that's the tricky part: there is a heavy cylindrical copper "inertial mass" with a high heat capacity (much higher than the base) standing on the base. Because of its high heat capacity, even very tiny temperature fluctuations in the temperature-controlled base produce a measurable voltage in the thermoelectric sensor between the inertial mass and the base. This value is measured by a nanovoltmeter (Keithley), and I use it to control the heat pumps under the base.  In principle, if no heat flows between the inertial mass and the base, we are in thermal equilibrium.

    With this concept it is possible to reach temperature stability down in the nanokelvin range!

    Recently I have been playing with the control settings, but only using simple PID controls (I have the PID toolkit). The method explained above is a kind of analogy to inertial guidance in mechanical control. I wonder if someone could give me advice or direction on how I could further improve the stability of this control; a rough sketch of my loop 3 follows below. Perhaps some more advanced control scheme? Feed-forward?
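
    For concreteness, here is roughly what loop 3 does (a sketch in Python for readability; read_nanovoltmeter() and set_peltier_current() are hypothetical stand-ins for the real GPIB/driver calls, and the gains and base current are placeholders to be tuned):

    import time

    def read_nanovoltmeter():
        return 0.0  # stand-in for the Keithley 2182 GPIB read, V

    def set_peltier_current(amps):
        pass        # stand-in for the Peltier supply write

    KP, KI, KD = 0.5, 0.02, 0.0  # PID gains, tuned for the ~0.6 s loop rate
    SETPOINT_V = 0.0             # zero thermopile voltage = no heat flow
    I_BASE = 0.100               # constant current carried over from loop 2, A
    DT = 0.6                     # loop period, limited by the nanovoltmeter

    integral = 0.0
    prev_error = 0.0
    while True:
        error = SETPOINT_V - read_nanovoltmeter()
        integral += error * DT
        derivative = (error - prev_error) / DT
        set_peltier_current(I_BASE + KP * error + KI * integral + KD * derivative)
        prev_error = error
        time.sleep(DT)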

    Thank you very much for the advice,

    Kind regards

    If I follow correctly, the feedback for control loop 3 is measured with the Keithley via GPIB, and those readings take approximately 0.6 seconds. The control loop cannot work faster than the measurement system (unless you want it a lot more unstable!).  Most high-precision instruments offer a trade-off between resolution and speed.  I took a look at the specs of the 2182 and it seems you may need all the resolution, so speeding it up may not be an option.

    Looking for the bottleneck is probably a good approach, given all that you have done so far.  It will help you avoid putting too much time into something that won't yield much improvement.

    Another thought: how noisy/drifty are the power supplies/amplifiers driving the Peltier devices?  I worked on a system several years ago where the objective was ~10 uK stability, and noise and drift in the power circuits were the main limitations.  Most engineers designing such devices never think in parts per million or fractions thereof.  Especially with the slow update rate of the nanovoltmeter, drift or noise in the power circuits could be important.  Also look at how constant the constant-output mode really is once the outer loops are running open loop.

    Lynn

  • Accuracy of PT100 temperature readings with the NI 9219 module

    Hello

    I am reading temperature with a PT100 connected to the NI 9219 module. I set LabVIEW to 4-wire RTD mode (PT100 range). I am able to read the temperature, but I do not know whether the accuracy is right or not. The temperature reads between 25.5 and 25.8 °C. Is this possible? Is there anything I can do to get more accuracy?

    Thank you in advance,

    Claudia

    Hi Claudia,

    The specifications for the 9217 provide the absolute measurement accuracy for a Pt 100 RTD (3- and 4-wire) on page 14, so there is no need to calculate it [1].

    Since the 9219 is more flexible in the types of measurements it can take, the absolute accuracy for each use case is not provided. Instead, NI gives gain and offset error values to be used in calculating the absolute accuracy [2].

    Follow this logic to calculate the absolute accuracy:

    (Absolute accuracy) = (input reading) * (gain error + gain temperature drift) + (offset error + offset temperature drift)

    Reading the "RTD, 4-wire or 3-wire, Pt 100" line in the table on page 24, the error for a module between 20 and 30 degrees Celsius is ±0.1% of reading for gain and ±2400 ppm of range for offset. Temperature drift comes into play when the module is outside that normal operating band; for Pt 100, the gain drift is ±15 ppm and the offset drift is ±60 ppm (see page 25).

    So, for a module at a room temperature of 23 degrees Celsius and a Pt100 at 100 degrees Celsius, using the 505 Ohm input range, the calculation is:

    At 100 °C the RTD measures 138.5 Ohm, given its temperature coefficient of 0.00385 Ohm/Ohm/°C, so:

    Absolute accuracy = (138.5 Ohm * 0.1%) + (2400 ppm * 505 Ohm)
    = 0.1385 Ohm + 1.212 Ohm
    = 1.3505 Ohm

    Dividing by the Pt100 sensitivity of 0.385 Ohm/°C:

    Absolute accuracy = ±3.5 degrees Celsius
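
    The same calculation in a few lines, so you can plug in other ranges or readings (a small Python sketch using the spec numbers above):

    GAIN_ERROR   = 0.001     # ±0.1% of reading (page 24)
    OFFSET_ERROR = 2400e-6   # ±2400 ppm of range (page 24)
    INPUT_RANGE  = 505.0     # Ohm
    SENSITIVITY  = 0.385     # Ohm/°C for a Pt100

    reading = 138.5          # Ohm, a Pt100 at 100 °C
    accuracy_ohm  = reading * GAIN_ERROR + OFFSET_ERROR * INPUT_RANGE
    accuracy_degc = accuracy_ohm / SENSITIVITY
    print(f"±{accuracy_ohm:.4f} Ohm = ±{accuracy_degc:.1f} °C")  # ±1.3505 Ohm = ±3.5 °C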

    [1] NI 9217 Operating Instructions and Specifications
    http://www.NI.com/PDF/manuals/374187c.PDF

    [2] NI 9219 specifications
    http://www.NI.com/PDF/manuals/374473b.PDF

    I hope this information helps.

    Best regards

    SUSE

  • power supply beeps and does not charge since moving to a country where the average temperature is 38 °C

    Since moving 3 months ago to Nicaragua, where I connect through a voltage converter, my Compaq Presario CQ61 notebook's power supply beeps and no longer charges.

    When disconnected, it beeps constantly.

    Is it possible the problem will resolve itself once the power supply cools to an ideal temperature, or does the beeping mean I will need to get a new charger?

    Thanks for your replies!

    The temperature is unlikely to be the cause of the problem.

    The power adapter (sometimes called a power brick) is designed to operate over a wide range of voltages (100-120 Volts AC or 220-240 Volts AC); the only thing that differs is the type of power cord used to connect the power adapter to a wall outlet at home or the office.

    Remove the converter and use the correct power cord or travel adapter for Nicaragua. Your adapter will not explode or be damaged. The converter is likely the issue.

    You need a North American power cord for the adapter. You should be able to buy one locally, or just go into a travel store and buy a travel adapter set. That's what I have for when I travel to the United States for business or pleasure.  I bought my Brookstone set for about $30.00 USD.

  • GridFieldManager with label gets wider

    I searched the forums but can't find the answer to my problem, so hopefully someone can help me here.

    I have a GridFieldManager with two LabelFields containing static text and two labels whose text I want to change at run time.

    GridFieldManager gfm = new GridFieldManager(2, 2, Manager.USE_ALL_WIDTH | GridFieldManager.FIXED_SIZE);
    // Fixed column widths: 300 px for the captions, 180 px for the values.
    gfm.setColumnProperty(0, GridFieldManager.FIXED_SIZE, 300);
    gfm.setColumnProperty(1, GridFieldManager.FIXED_SIZE, 180);
    // Use float literals: 195 / 10 in integer math gives 19, not 19.5.
    float TMeasured = 180f / 10f;
    float TSetpoint = 195f / 10f;
    lTemperatureMeasured = new LabelField(Float.toString(TMeasured) + "°c");
    lTemperatureSetpoint = new LabelField(Float.toString(TSetpoint) + "°c");
    gfm.add(new LabelField("Temperature measured"), FIELD_LEFT);
    gfm.add(lTemperatureMeasured, FIELD_LEFT); // or FIELD_RIGHT
    gfm.add(new LabelField("Temperature setting"), FIELD_LEFT);
    gfm.add(lTemperatureSetpoint, FIELD_LEFT); // or FIELD_RIGHT
    add(gfm);

    The problem is that when I change the text of the label, the GridFieldManager becomes wider! It simply grows and grows, by something like 5 pixels each time. Changing the fixed width did not help.

    I just can't understand why this happens. Why does it do that?

    The text change is done with a button:

    public void fieldChanged(Field field, int context)
    {
        // Update the setpoint label when the change-temperature button is pressed.
        if(field == bChangeTemperature)
        {
             lTemperatureSetpoint.setText("t");
        }
    }
    

  • Finding the color temperature of a light source with a neutral card

    Hi all;

    I have a color-adjustable LED strip that I would like to get as close as possible to 6700 Kelvin. I am hoping I can take pictures of a neutral gray card and set the LED color based on the temperature/RGB values I get from Lightroom (5.7.1).  Is this possible? Can anyone help with the workflow for how to do it?

    Thank you very much

    I was afraid of that. The numbers only make sense for a black-body radiator like a tungsten light bulb or the Sun, which is a broadband light source. CFLs generally do pretty well, being made with a series of rare-earth phosphors tuned to human vision and with reasonably broadband emission. If you use several monochromatic light sources, such as single-color LEDs, you will not get good numbers. It really depends on the wavelengths of light coming out of the LEDs. If your camera's color filters are not matched to the LED wavelengths, and the sensitivity of the camera sensor's three color channels is not the same as your eyes', you can get into a situation where your eye perceives neutral colors while the camera sees something completely non-neutral. You can see this in your two raw files: your LED-illuminated gray card probably looked neutral to the eye, but to the camera it looked purple. This effect is called metameric failure and is common when the light's spectrum is very "spiky". That is probably what is happening in your case. You almost certainly have three narrow-band LEDs that you are mixing. You really cannot apply the single black-body "color temperature" model to that, and so you get the strange results you observe. There are special high color-rendering-index LEDs, which have a much wider spectrum, show far fewer problems with metameric failure, and should work well for this purpose.

  • LR 6.3 - How to set the color temperature in units of 50?

    Recently updated from LR 5 to LR 6.3. In LR 5, I could click on the color temp box and drag left and right to adjust the color temperature in 50-unit increments. Now it will sometimes start in 50-unit increments, but if I drag further it starts changing the setting by seemingly random amounts. I know this is normal behavior for the actual slider, but I would like to get the 50-unit increment behavior back when clicking/dragging on the color temp box. Thanks for any help.

    If you use the up and down arrow keys instead of the mouse, the increment is 50.

    If you make the panel wider, you get finer granularity with mouse movements, but that takes away space from the picture display.

  • Wide Gamut RGB / sRGB

    What is the difference between these two? Also, what are the merits of using either?

    I shot a lot of samples of solid color in RAW, cropped and resized in DPP, then transferred to PS8 for color work and spotting. I don't know how, but about half of the resulting images are off-color when uploaded to my web gallery. The sRGB shots are okay, but all the Wide Gamut shots lack true colors and look dull. Where Wide Gamut came from I do not know; I must have clicked the wrong box or something.

    Can I now convert the Wide Gamut JPEGs to sRGB, or will I have to start again from the raw files?

    More than 150-odd photos are involved, with names and numbers attached to catch points, and I would rather not do it all again, so any help much appreciated.

    I'm very new to all this, moving from DPP into PS, and struggling to get both straight in my head.


    I am beginning to understand why the workflow is planned the way it is...

    I am trying to understand what you take the photos for, and the nature of your "business", if that's what it is:

    Do you make tiles for people, or do you just sell them the materials to make their own tiles, more of a craft site?

    Do you take photos of as many of the "unlimited" colors as you can, for people to see what they look like on the tile?

    Do you take photographs of current customers' orders as examples for potential customers?

    Do you take photographs of current customers' orders so the customer can see their order and check that it's right before you produce it for them?

    Is the paper you refer to a print of your photo, meant to match what the customer sees on the website? Is it part of a printed catalog, or is it a one-per-order print for the customer to check against?

    I ask these questions as a way of understanding how accurately the website viewer expects the photo to match the color of the tile once it is installed, and whether the audience for each photograph is all potential customers or just one particular customer.

    Finally, is this floor or wall tile, or more colorful or artistic decorative tile meant as an accent or border, not covering a large area?  This is to predict what sort of lighting people would generally view their tiles under, and to judge whether photographing with incandescent lighting is typical of what a customer would have or not.

    --

    There are two separate issues with dull-looking colors, independent of each other. The first is having a Wide Gamut JPEG displayed in a browser that assumes sRGB, where all the colors go dull; this is the very obvious one you posted about.  The other is that reddish colors will look dull when photographed under reddish lighting, which incandescent is compared to daylight.  This second issue is more subtle and would affect all the tile pictures, even those with the correct sRGB profile.

    --

    You ask if using a color checker or daylight bulbs would help. To answer this yourself, take a picture of some colored tiles in your incandescent setup against your white background, then take the background and the tiles outside and take another picture of them in sunlight or under a fully overcast sky, away from colorful objects or buildings, and compare the two side by side to see if you notice any color shifts across the tiles.

    To make the comparison more accurate, you should white-balance on the white background instead of using a named WB setting such as Tungsten or Daylight.  In DPP, you can set the WB by going to the RAW tab in the editor, clicking the eyedropper button, and then clicking on the white background.  Do this for the incandescent RAW photo and the sunny/cloudy RAW photo.  You may need to adjust the brightness (or exposure) of the outdoor one so it has the same brightness and contrast as the indoor one, then save each as a JPEG and compare the colors between them, before making adjustments in PSE.  If you do not see much difference between the incandescent shot and the sunlight shot, then a color checker or daylight bulbs probably won't make that much difference.  I would expect you will see differences in how warm the colors are relative to each other when you flip between the two shots or compare them side by side, depending on what picture viewer program and monitor setup you have, but maybe you will decide they are not significant enough to worry about.

    --

    A few comments on the color-adjustment methods you have described:

    You say that you hold the subject you photographed next to your monitor and then adjust things in PSE until the colors look right:

    First of all, what is the lighting next to your computer, by which you see the subject? Is it the same light it was photographed under, or something different, such as daylight through a curtained window or fluorescent lamps?

    Secondly, is your monitor calibrated somehow, using a hardware spectrophotometer, or have you at least attempted to set the monitor's color temperature so that white things on the screen look like white things in the immediate vicinity of the monitor, as illuminated by whatever lights you view the subject under while trying to match the colors?

    If the answer to the monitor-calibration question is no, then you are really just making the colors look right on your own computer screen, which may or may not match the screens of the other people viewing the site.  With a calibrated monitor, at least you know that any color inaccuracy on a different monitor is due to that monitor's lack of calibration, not the sum of both your screen's and the other viewer's lack of calibration.

    When adjusting color in PSE, if you are viewing the subject under different light than you took the photo in, it will probably be difficult to get the colors to look right.

    Personally, I would use daylight-balanced light for the photos, whether actual daylight, daylight bulbs, or diffuse/bounced flash, perhaps through a light tent.  For the white balance, I would use the eyedropper on a neutral area of the photograph.  I would use a calibrated monitor, and rely on a standard profile in my RAW converter or create my own with a ColorChecker Passport.  Finally, if I were manually tweaking the colors by comparing the monitor to the subject held nearby, I would use daylight-balanced lighting while viewing the object.

    Without a better understanding of how correct the color needs to be for each of your photos, my approach may be too much work, and what you have may be OK.  You could ask the opinion of other people who have access to the original object (the colorful tile that was photographed), comparing it to the photo of that tile on the website on their own monitors, under different lighting situations: incandescent, daylight, fluorescent, to see how well the colors hold up for them or not.

  • 34 "wide ultra LG and MBPR 15 'Sierra resolution problem '.

    Please advise, people

    34 "wide ultra LG and MBPR 15"NVIDIA GeForce OS Sierra do not support the native resolution.

    In fact, I have the 25" UM25 model. However, Sierra recognizes it as a UM34 anyway, and I'm having exactly the same problem. I updated to Sierra yesterday.

  • macOS Sierra no longer supports ultrawide monitors?

    Just installed macOS Sierra and the required resolution for this LG ultrawide monitor no longer appears. Anyone know of a workaround?

    My problem with my LG UltraWide monitor is that it now only scales to 1920 x 1080, whereas on El Capitan it could go to 2560 x 1080. It seems like Apple just ditched ultrawide support.

Maybe you are looking for

  • Can't send or receive sms on 6s

    Got a 6s the other day and can not use SMS. Someone help?

  • How to open different files depending on the case

    Hello, I have 3 cases within a case structure. When I select the first case, I need a certain VI to start (I did this by simply dragging the VI into that case). But when I select the 2nd case, I need an executable file to pop up, and I am not able to drag it into the structure

  • Is there a function like the date and time picker control in ICB 8.5?

    I need a function like the "date and time picker" control used in ActiveX, but my operating system (Windows 7, without Office software) does not have the «Microsoft date and time picker» control. I am looking for an article on the internet which mentions this reg

  • Outlook 2003

    I would like to know if I can display both my e-mails and my calendar at the same time on one screen; I think I saw this in newer versions of Outlook.  Thank you.

  • Best graphics card for Inspiron 3647?

    I don't know how to find this on the Dell website, but I want to put the best video card I can in the Inspiron 3647 desktop computer. I would rather have a 1 to 2 GB RAM video card, but I don't know if that is even supported or if I can pass the power s